Mar 18 13:59:57 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 18 13:59:57 crc restorecon[4708]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 13:59:57 crc restorecon[4708]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc 
restorecon[4708]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc 
restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 
13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc 
restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:59:57 crc 
restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 
crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:57 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc 
restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc 
restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc 
restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc 
restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 13:59:58 crc restorecon[4708]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 13:59:58 crc restorecon[4708]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 13:59:58 crc restorecon[4708]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 18 13:59:59 crc kubenswrapper[4756]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 13:59:59 crc kubenswrapper[4756]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 18 13:59:59 crc kubenswrapper[4756]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 13:59:59 crc kubenswrapper[4756]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 13:59:59 crc kubenswrapper[4756]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 18 13:59:59 crc kubenswrapper[4756]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.097352 4756 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101110 4756 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101157 4756 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101164 4756 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101169 4756 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101175 4756 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101182 4756 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101188 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101193 4756 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101198 4756 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101212 4756 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101219 4756 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101226 4756 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101231 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101236 4756 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101241 4756 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101246 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101252 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101257 4756 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101263 4756 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101270 4756 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101282 4756 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101292 4756 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101301 4756 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101309 4756 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101316 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101321 4756 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101327 4756 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101332 4756 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101337 4756 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101342 4756 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101350 4756 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101359 4756 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101365 4756 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101371 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101376 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101382 4756 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101387 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101392 4756 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101397 4756 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101402 4756 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101407 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101412 4756 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101417 4756 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101422 4756 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101427 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101434 4756 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101439 4756 feature_gate.go:330] 
unrecognized feature gate: NewOLM Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101444 4756 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101450 4756 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101454 4756 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101459 4756 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101464 4756 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101469 4756 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101475 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101480 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101486 4756 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101491 4756 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101496 4756 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101502 4756 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101506 4756 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101511 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 
13:59:59.101516 4756 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101521 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101527 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101532 4756 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101538 4756 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101543 4756 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101548 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101555 4756 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101562 4756 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.101568 4756 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102425 4756 flags.go:64] FLAG: --address="0.0.0.0" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102444 4756 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102458 4756 flags.go:64] FLAG: --anonymous-auth="true" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102467 4756 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102474 4756 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102481 4756 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102489 4756 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102501 4756 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102507 4756 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102513 4756 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102520 4756 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102527 4756 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102533 4756 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102540 4756 flags.go:64] FLAG: 
--cgroup-root="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102546 4756 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102552 4756 flags.go:64] FLAG: --client-ca-file="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102559 4756 flags.go:64] FLAG: --cloud-config="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102567 4756 flags.go:64] FLAG: --cloud-provider="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102579 4756 flags.go:64] FLAG: --cluster-dns="[]" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102596 4756 flags.go:64] FLAG: --cluster-domain="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102604 4756 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102612 4756 flags.go:64] FLAG: --config-dir="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102619 4756 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102627 4756 flags.go:64] FLAG: --container-log-max-files="5" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102638 4756 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102646 4756 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102654 4756 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102663 4756 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102670 4756 flags.go:64] FLAG: --contention-profiling="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102678 4756 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102686 4756 flags.go:64] FLAG: 
--cpu-cfs-quota-period="100ms" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102694 4756 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102701 4756 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102711 4756 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102719 4756 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102727 4756 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102734 4756 flags.go:64] FLAG: --enable-load-reader="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102742 4756 flags.go:64] FLAG: --enable-server="true" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102750 4756 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102762 4756 flags.go:64] FLAG: --event-burst="100" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102769 4756 flags.go:64] FLAG: --event-qps="50" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102777 4756 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102783 4756 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102789 4756 flags.go:64] FLAG: --eviction-hard="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102796 4756 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102803 4756 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102810 4756 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102816 4756 
flags.go:64] FLAG: --eviction-soft="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102822 4756 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102828 4756 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102835 4756 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102841 4756 flags.go:64] FLAG: --experimental-mounter-path="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102846 4756 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102852 4756 flags.go:64] FLAG: --fail-swap-on="true" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102858 4756 flags.go:64] FLAG: --feature-gates="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102866 4756 flags.go:64] FLAG: --file-check-frequency="20s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102872 4756 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102878 4756 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102885 4756 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102891 4756 flags.go:64] FLAG: --healthz-port="10248" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102897 4756 flags.go:64] FLAG: --help="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102903 4756 flags.go:64] FLAG: --hostname-override="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102909 4756 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102915 4756 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102921 4756 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102929 4756 flags.go:64] FLAG: --image-credential-provider-config="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102935 4756 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102941 4756 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102948 4756 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102953 4756 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102959 4756 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102966 4756 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102972 4756 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102977 4756 flags.go:64] FLAG: --kube-reserved="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102983 4756 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102989 4756 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.102996 4756 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103001 4756 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103008 4756 flags.go:64] FLAG: --lock-file="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103014 4756 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103020 4756 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103026 4756 
flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103037 4756 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103043 4756 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103049 4756 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103055 4756 flags.go:64] FLAG: --logging-format="text" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103062 4756 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103068 4756 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103074 4756 flags.go:64] FLAG: --manifest-url="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103080 4756 flags.go:64] FLAG: --manifest-url-header="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103088 4756 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103094 4756 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103101 4756 flags.go:64] FLAG: --max-pods="110" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103107 4756 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103136 4756 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103144 4756 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103151 4756 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103158 4756 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 13:59:59 crc 
kubenswrapper[4756]: I0318 13:59:59.103165 4756 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103172 4756 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103185 4756 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103191 4756 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103198 4756 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103204 4756 flags.go:64] FLAG: --pod-cidr="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103210 4756 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103220 4756 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103226 4756 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103232 4756 flags.go:64] FLAG: --pods-per-core="0" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103238 4756 flags.go:64] FLAG: --port="10250" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103244 4756 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103251 4756 flags.go:64] FLAG: --provider-id="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103259 4756 flags.go:64] FLAG: --qos-reserved="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103267 4756 flags.go:64] FLAG: --read-only-port="10255" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103282 4756 flags.go:64] FLAG: --register-node="true" Mar 18 13:59:59 crc 
kubenswrapper[4756]: I0318 13:59:59.103293 4756 flags.go:64] FLAG: --register-schedulable="true" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103301 4756 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103314 4756 flags.go:64] FLAG: --registry-burst="10" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103321 4756 flags.go:64] FLAG: --registry-qps="5" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103329 4756 flags.go:64] FLAG: --reserved-cpus="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103336 4756 flags.go:64] FLAG: --reserved-memory="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103343 4756 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103349 4756 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103356 4756 flags.go:64] FLAG: --rotate-certificates="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103362 4756 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103368 4756 flags.go:64] FLAG: --runonce="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103374 4756 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103380 4756 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103386 4756 flags.go:64] FLAG: --seccomp-default="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103392 4756 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103400 4756 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103406 4756 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 
18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103412 4756 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103418 4756 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103424 4756 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103430 4756 flags.go:64] FLAG: --storage-driver-table="stats" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103436 4756 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103442 4756 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103449 4756 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103455 4756 flags.go:64] FLAG: --system-cgroups="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103461 4756 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103471 4756 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103476 4756 flags.go:64] FLAG: --tls-cert-file="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103483 4756 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103491 4756 flags.go:64] FLAG: --tls-min-version="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103496 4756 flags.go:64] FLAG: --tls-private-key-file="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103502 4756 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103508 4756 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103514 4756 flags.go:64] FLAG: 
--topology-manager-scope="container" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103521 4756 flags.go:64] FLAG: --v="2" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103528 4756 flags.go:64] FLAG: --version="false" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103536 4756 flags.go:64] FLAG: --vmodule="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103543 4756 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.103549 4756 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103691 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103708 4756 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103717 4756 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103723 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103730 4756 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103735 4756 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103742 4756 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103747 4756 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103753 4756 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103759 4756 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 
13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103764 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103770 4756 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103775 4756 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103780 4756 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103785 4756 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103790 4756 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103795 4756 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103801 4756 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103806 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103811 4756 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103816 4756 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103821 4756 feature_gate.go:330] unrecognized feature gate: Example Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103827 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103832 4756 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103837 4756 feature_gate.go:330] unrecognized feature gate: 
SignatureStores Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103842 4756 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103847 4756 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103854 4756 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103859 4756 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103864 4756 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103869 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103874 4756 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103882 4756 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103888 4756 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103894 4756 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103900 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103907 4756 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103915 4756 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103922 4756 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103928 4756 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103935 4756 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103941 4756 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103948 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103954 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103959 4756 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103964 4756 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103969 4756 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103974 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103980 4756 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103985 4756 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103990 4756 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 
13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.103995 4756 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104000 4756 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104005 4756 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104010 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104016 4756 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104020 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104026 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104031 4756 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104036 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104041 4756 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104046 4756 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104052 4756 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104057 4756 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104062 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 13:59:59 crc kubenswrapper[4756]: 
W0318 13:59:59.104067 4756 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104072 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104077 4756 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104082 4756 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104144 4756 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.104151 4756 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.105321 4756 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.116537 4756 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.116584 4756 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116673 4756 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116690 4756 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116695 4756 feature_gate.go:330] unrecognized 
feature gate: Example Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116699 4756 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116704 4756 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116712 4756 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116718 4756 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116723 4756 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116729 4756 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116733 4756 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116739 4756 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116745 4756 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116752 4756 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116757 4756 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116762 4756 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116767 4756 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116772 4756 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116777 4756 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116783 4756 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116787 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116792 4756 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116797 4756 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116801 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116806 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116809 4756 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116813 4756 
feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116818 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116822 4756 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116827 4756 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116832 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116837 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116842 4756 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116846 4756 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116850 4756 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116855 4756 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116859 4756 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116864 4756 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116868 4756 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116875 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116880 4756 feature_gate.go:330] unrecognized 
feature gate: RouteAdvertisements Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116885 4756 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116890 4756 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116894 4756 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116901 4756 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116907 4756 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116912 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116918 4756 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116923 4756 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116929 4756 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116934 4756 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116938 4756 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116943 4756 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116949 4756 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116953 4756 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116957 4756 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116962 4756 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116966 4756 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116971 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116975 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116980 4756 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116984 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116989 4756 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116993 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.116997 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117002 4756 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117007 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117011 4756 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117027 4756 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117031 4756 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117036 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117041 4756 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.117049 4756 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117230 4756 feature_gate.go:330] unrecognized feature gate: 
MachineConfigNodes Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117241 4756 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117246 4756 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117251 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117256 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117261 4756 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117266 4756 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117271 4756 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117275 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117281 4756 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117288 4756 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117293 4756 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117300 4756 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117305 4756 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117310 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117314 4756 feature_gate.go:330] unrecognized feature gate: Example Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117318 4756 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117324 4756 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117330 4756 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117334 4756 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117339 4756 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117344 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117348 4756 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117353 4756 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117358 4756 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117362 4756 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117367 4756 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117371 4756 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117374 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117378 4756 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117382 4756 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117386 4756 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 13:59:59 crc 
kubenswrapper[4756]: W0318 13:59:59.117389 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117393 4756 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117397 4756 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117400 4756 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117404 4756 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117407 4756 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117410 4756 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117414 4756 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117417 4756 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117420 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117424 4756 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117427 4756 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117431 4756 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117435 4756 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117439 4756 
feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117442 4756 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117446 4756 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117449 4756 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117453 4756 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117457 4756 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117460 4756 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117464 4756 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117467 4756 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117472 4756 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117476 4756 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117480 4756 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117483 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117487 4756 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117491 4756 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117494 4756 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117499 4756 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117503 4756 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117506 4756 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117510 4756 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117514 4756 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117517 4756 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117520 4756 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117524 4756 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.117529 4756 feature_gate.go:353] Setting GA feature gate 
ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.117535 4756 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.118624 4756 server.go:940] "Client rotation is on, will bootstrap in background" Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.124546 4756 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.128564 4756 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.128724 4756 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.131291 4756 server.go:997] "Starting client certificate rotation" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.131329 4756 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.131473 4756 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.158595 4756 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.160937 4756 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.34:6443: connect: connection refused" logger="UnhandledError" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.161889 4756 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.182742 4756 log.go:25] "Validated CRI v1 runtime API" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.220528 4756 log.go:25] "Validated CRI v1 image API" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.222874 4756 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.226907 4756 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-18-13-55-39-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.226933 4756 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.243146 4756 manager.go:217] Machine: {Timestamp:2026-03-18 13:59:59.241144613 +0000 UTC m=+0.555562598 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2 BootID:aa49f241-7e2e-4961-9f17-9d946c9cd47b Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:ae:66:74 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ae:66:74 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:0d:76:3f Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a3:d6:f8 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:db:4c:f9 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:86:c3:54 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ee:5f:c8:26:30:e6 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:4e:9a:5e:2d:7b:3e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.243354 4756 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.243472 4756 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.246379 4756 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.246687 4756 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.246732 4756 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.246988 4756 topology_manager.go:138] "Creating topology manager with none policy"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.246998 4756 container_manager_linux.go:303] "Creating device plugin manager"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.247497 4756 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.247535 4756 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.247795 4756 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.247896 4756 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.251092 4756 kubelet.go:418] "Attempting to sync node with API server"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.251136 4756 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.251174 4756 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.251191 4756 kubelet.go:324] "Adding apiserver pod source"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.251205 4756 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.254326 4756 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.255252 4756 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.256202 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.34:6443: connect: connection refused
Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.256296 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.34:6443: connect: connection refused" logger="UnhandledError"
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.256506 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.34:6443: connect: connection refused
Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.256551 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.34:6443: connect: connection refused" logger="UnhandledError"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.257329 4756 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.258599 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.258623 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.258631 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.258643 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.258656 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.258669 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.258681 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.258693 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.258701 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.258708 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.258732 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.258741 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.258765 4756 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.259206 4756 server.go:1280] "Started kubelet"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.259365 4756 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.259529 4756 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.259956 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.34:6443: connect: connection refused
Mar 18 13:59:59 crc systemd[1]: Started Kubernetes Kubelet.
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.261034 4756 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.262076 4756 server.go:460] "Adding debug handlers to kubelet server"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.263072 4756 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.263104 4756 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.263475 4756 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.263532 4756 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.263539 4756 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.263597 4756 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.263971 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.34:6443: connect: connection refused
Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.264059 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.34:6443: connect: connection refused" logger="UnhandledError"
Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.264516 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.34:6443: connect: connection refused" interval="200ms"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.266011 4756 factory.go:55] Registering systemd factory
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.266037 4756 factory.go:221] Registration of the systemd container factory successfully
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.266493 4756 factory.go:153] Registering CRI-O factory
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.271307 4756 factory.go:221] Registration of the crio container factory successfully
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.271378 4756 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.271410 4756 factory.go:103] Registering Raw factory
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.271460 4756 manager.go:1196] Started watching for new ooms in manager
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.274144 4756 manager.go:319] Starting recovery of all containers
Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.275171 4756 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.34:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189df4438f5ea59a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.259174298 +0000 UTC m=+0.573592263,LastTimestamp:2026-03-18 13:59:59.259174298 +0000 UTC m=+0.573592263,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284456 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284539 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284557 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284576 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284591 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284607 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284621 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284636 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284655 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284669 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284684 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284697 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284710 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284728 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284741 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284772 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284788 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284803 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284817 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284832 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284846 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284861 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284875 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284890 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284905 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284920 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284938 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284955 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284969 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284984 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.284999 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.285017 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.285032 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.285048 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.285089 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.285101 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.285131 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.285148 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.285163 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289065 4756 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289142 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289160 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289177 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289190 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289203 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289216 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289230 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289241 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289255 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289271 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289283 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289295 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289307 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289325 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289340 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289355 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289371 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289384 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289395 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289406 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289420 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289432 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289461 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289474 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289489 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289520 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289537 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289564 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289589 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289610 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289620 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289643 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289664 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289676 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289689 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289709 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289738 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289763 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289776 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289796 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289823 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289857 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289873 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config"
seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289917 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289929 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289943 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.289954 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.292932 4756 manager.go:324] Recovery completed Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.291229 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.292993 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293008 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293019 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293030 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293044 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293057 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293067 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" 
seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293078 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293089 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293100 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293112 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293136 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293148 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 18 13:59:59 crc 
kubenswrapper[4756]: I0318 13:59:59.293158 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293169 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293180 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293191 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293209 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293221 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293232 4756 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293244 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293257 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293270 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293282 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293295 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293312 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293331 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293347 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293358 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293370 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293380 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293390 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293401 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293413 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293433 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293444 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293455 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293468 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" 
seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293479 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293489 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293504 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293515 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293526 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293538 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293548 4756 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293568 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293583 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293593 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293605 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293616 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293628 4756 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293638 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293649 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293659 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293670 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293681 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293693 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293712 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293727 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293741 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293752 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293763 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293774 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293783 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293800 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293810 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293821 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293832 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293843 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" 
seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293853 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293864 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293875 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293888 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293898 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293909 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 
13:59:59.293924 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293939 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293950 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293961 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293972 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293983 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.293994 4756 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294006 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294021 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294032 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294042 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294053 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294064 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294075 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294086 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294098 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294108 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294181 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294193 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" 
seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294208 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294218 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294228 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294238 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294249 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294258 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294268 4756 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294279 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294290 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294302 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294326 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294338 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294349 4756 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294359 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294368 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294378 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294391 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294400 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294413 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294423 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294433 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294444 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294457 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294468 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294478 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294487 4756 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294503 4756 reconstruct.go:97] "Volume reconstruction finished" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.294511 4756 reconciler.go:26] "Reconciler: start to sync state" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.305369 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.307681 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.307727 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.307747 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.309261 4756 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.309283 4756 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.309303 4756 state_mem.go:36] "Initialized new in-memory state store" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.312558 4756 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.314092 4756 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.314155 4756 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.314187 4756 kubelet.go:2335] "Starting kubelet main sync loop" Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.314240 4756 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.314747 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.34:6443: connect: connection refused Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.314806 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.34:6443: connect: connection refused" logger="UnhandledError" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.331401 4756 policy_none.go:49] "None policy: Start" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.333086 4756 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.333141 4756 state_mem.go:35] "Initializing new in-memory state store" Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.363903 4756 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.381019 4756 manager.go:334] 
"Starting Device Plugin manager" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.381075 4756 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.381085 4756 server.go:79] "Starting device plugin registration server" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.381467 4756 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.381481 4756 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.381633 4756 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.381703 4756 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.381713 4756 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.389498 4756 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.414693 4756 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.414783 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.415535 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.415587 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.415598 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.415712 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.415950 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.416000 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.416404 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.416430 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.416439 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.416515 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.416701 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.416762 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.416751 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.416804 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.416814 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.417167 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.417189 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.417216 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.417346 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.417473 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.417505 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.417698 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.417736 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.417752 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.418205 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.418228 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.418237 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.418316 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.418352 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.418367 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.418375 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:59:59 crc kubenswrapper[4756]: 
I0318 13:59:59.418431 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.418462 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.418919 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.418945 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.418958 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.419147 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.419180 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.419193 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.419246 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.419278 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.419988 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.420007 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.420018 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.465723 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.34:6443: connect: connection refused" interval="400ms" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.481773 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.483163 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.483197 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.483207 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.483230 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.483675 4756 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.34:6443: connect: connection refused" node="crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.496304 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.496364 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.496415 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.496460 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.496503 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.496543 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.496584 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.496658 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.496711 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.496762 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.496821 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.497034 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.497087 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.497154 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.497528 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 
18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599023 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599105 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599204 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599249 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599279 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599330 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599294 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599371 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599419 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599434 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599462 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599507 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599485 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599566 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599551 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599610 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599530 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599659 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599739 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599793 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599829 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599893 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599923 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599979 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.600004 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.600034 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.600049 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599948 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.599981 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.600321 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.684629 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.685898 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.685955 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.685972 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.686004 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.686538 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.34:6443: connect: connection refused" node="crc" Mar 
18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.768281 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.776876 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.796279 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.813582 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.815865 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-c86463e59bb3eeb2fa1cc5bc67f64d53dd06192bd3a3deec0ca7f7d0517d8988 WatchSource:0}: Error finding container c86463e59bb3eeb2fa1cc5bc67f64d53dd06192bd3a3deec0ca7f7d0517d8988: Status 404 returned error can't find the container with id c86463e59bb3eeb2fa1cc5bc67f64d53dd06192bd3a3deec0ca7f7d0517d8988 Mar 18 13:59:59 crc kubenswrapper[4756]: I0318 13:59:59.817792 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.817976 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-27a917086839f90b32daa45506d221b5d666ab30098fa67fd737e59f8da0c664 WatchSource:0}: Error finding container 27a917086839f90b32daa45506d221b5d666ab30098fa67fd737e59f8da0c664: Status 404 returned error can't find the container with id 27a917086839f90b32daa45506d221b5d666ab30098fa67fd737e59f8da0c664 Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.826383 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0840f7d905d3916b704397ec7f7acfe169ea4e9ca071379957ea6557b0ea7e4b WatchSource:0}: Error finding container 0840f7d905d3916b704397ec7f7acfe169ea4e9ca071379957ea6557b0ea7e4b: Status 404 returned error can't find the container with id 0840f7d905d3916b704397ec7f7acfe169ea4e9ca071379957ea6557b0ea7e4b Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.835579 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-3decadb90befc757accfc7705b51b125804a56d8ec51e6cc616f00789ccdf4ec WatchSource:0}: Error finding container 3decadb90befc757accfc7705b51b125804a56d8ec51e6cc616f00789ccdf4ec: Status 404 returned error can't find the container with id 3decadb90befc757accfc7705b51b125804a56d8ec51e6cc616f00789ccdf4ec Mar 18 13:59:59 crc kubenswrapper[4756]: W0318 13:59:59.840690 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-41959ea45d10c327dd96e48d115f9fadeb658e3638a7776e8ac887d7d58a600b 
WatchSource:0}: Error finding container 41959ea45d10c327dd96e48d115f9fadeb658e3638a7776e8ac887d7d58a600b: Status 404 returned error can't find the container with id 41959ea45d10c327dd96e48d115f9fadeb658e3638a7776e8ac887d7d58a600b Mar 18 13:59:59 crc kubenswrapper[4756]: E0318 13:59:59.867398 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.34:6443: connect: connection refused" interval="800ms" Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.087656 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.090884 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.090934 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.090950 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.090980 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 14:00:00 crc kubenswrapper[4756]: E0318 14:00:00.091404 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.34:6443: connect: connection refused" node="crc" Mar 18 14:00:00 crc kubenswrapper[4756]: W0318 14:00:00.113847 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.34:6443: connect: connection refused Mar 18 14:00:00 crc 
kubenswrapper[4756]: E0318 14:00:00.113978 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.34:6443: connect: connection refused" logger="UnhandledError" Mar 18 14:00:00 crc kubenswrapper[4756]: W0318 14:00:00.177816 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.34:6443: connect: connection refused Mar 18 14:00:00 crc kubenswrapper[4756]: E0318 14:00:00.177893 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.34:6443: connect: connection refused" logger="UnhandledError" Mar 18 14:00:00 crc kubenswrapper[4756]: W0318 14:00:00.222303 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.34:6443: connect: connection refused Mar 18 14:00:00 crc kubenswrapper[4756]: E0318 14:00:00.222404 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.34:6443: connect: connection refused" logger="UnhandledError" Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.261909 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.34:6443: connect: connection refused Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.318185 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0840f7d905d3916b704397ec7f7acfe169ea4e9ca071379957ea6557b0ea7e4b"} Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.319198 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"27a917086839f90b32daa45506d221b5d666ab30098fa67fd737e59f8da0c664"} Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.320449 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c86463e59bb3eeb2fa1cc5bc67f64d53dd06192bd3a3deec0ca7f7d0517d8988"} Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.321444 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"41959ea45d10c327dd96e48d115f9fadeb658e3638a7776e8ac887d7d58a600b"} Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.322383 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3decadb90befc757accfc7705b51b125804a56d8ec51e6cc616f00789ccdf4ec"} Mar 18 14:00:00 crc kubenswrapper[4756]: E0318 14:00:00.668195 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.34:6443: connect: connection refused" interval="1.6s" Mar 18 14:00:00 crc kubenswrapper[4756]: W0318 14:00:00.845923 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.34:6443: connect: connection refused Mar 18 14:00:00 crc kubenswrapper[4756]: E0318 14:00:00.846000 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.34:6443: connect: connection refused" logger="UnhandledError" Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.891633 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.893058 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.893100 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.893112 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:00 crc kubenswrapper[4756]: I0318 14:00:00.893145 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 14:00:00 crc kubenswrapper[4756]: E0318 14:00:00.893514 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.34:6443: connect: connection refused" node="crc" 
Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.261721 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.34:6443: connect: connection refused Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.261766 4756 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 14:00:01 crc kubenswrapper[4756]: E0318 14:00:01.262996 4756 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.34:6443: connect: connection refused" logger="UnhandledError" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.327508 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.327496 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94"} Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.327918 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446"} Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.327971 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76"} Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.327982 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c"} Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.329046 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2" exitCode=0 Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.329166 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2"} Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.329268 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.329370 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.329483 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.329582 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.330285 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.330317 4756 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.330328 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.332595 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.332783 4756 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94" exitCode=0 Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.332847 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94"} Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.332934 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.333445 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.333482 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.333496 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.333889 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.333911 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:01 
crc kubenswrapper[4756]: I0318 14:00:01.333922 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.335299 4756 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713" exitCode=0 Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.335359 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.335408 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713"} Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.336612 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.336656 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.336670 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.337652 4756 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471" exitCode=0 Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.337693 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471"} Mar 18 
14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.337776 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.338667 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.338701 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:01 crc kubenswrapper[4756]: I0318 14:00:01.338713 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:01 crc kubenswrapper[4756]: W0318 14:00:01.801407 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.34:6443: connect: connection refused Mar 18 14:00:01 crc kubenswrapper[4756]: E0318 14:00:01.801488 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.34:6443: connect: connection refused" logger="UnhandledError" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.261053 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.34:6443: connect: connection refused Mar 18 14:00:02 crc kubenswrapper[4756]: E0318 14:00:02.269006 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.34:6443: connect: 
connection refused" interval="3.2s" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.341594 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"92f5c994aeb269fd08691355c96c16c3f714e3dca2d9b8329015e9d32d4e1ad1"} Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.341645 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7d9ab2bc4932c031750af26b1bee4bba55196c4a3f57252597753681ae34d37a"} Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.341659 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"40a7ae4a678ad496f072b4a15ec21271439efe888a3dcaeab650918b804f3df4"} Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.341763 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.342728 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.342756 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.342769 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.345475 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ac262a2299741db6528266da300b1255851399275365cd9c84b5f7b5f82f6d2b"} Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.345503 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b"} Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.345513 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34"} Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.345521 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1"} Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.345530 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80"} Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.345609 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.346336 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.346358 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.346366 4756 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.347938 4756 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5" exitCode=0 Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.347992 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5"} Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.348010 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.348726 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.348762 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.348771 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.350150 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.350180 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.350692 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada"} Mar 18 
14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.350847 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.350878 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.350890 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.351176 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.351193 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.351200 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.493599 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.494812 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.494842 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.494852 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.494876 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 14:00:02 crc kubenswrapper[4756]: E0318 14:00:02.495395 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.34:6443: connect: connection refused" node="crc" Mar 18 14:00:02 crc kubenswrapper[4756]: I0318 14:00:02.932072 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.034889 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.354936 4756 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a" exitCode=0 Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.355041 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.355074 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.355103 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.355351 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.355396 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a"} Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.355415 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.355376 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.355620 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.356484 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.356505 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.356516 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.356592 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.356635 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.356658 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.357201 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.357228 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.357239 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.357889 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.357890 4756 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.357947 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.357963 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.357917 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.358056 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:03 crc kubenswrapper[4756]: I0318 14:00:03.935236 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 14:00:04 crc kubenswrapper[4756]: I0318 14:00:04.362778 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac"} Mar 18 14:00:04 crc kubenswrapper[4756]: I0318 14:00:04.362820 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913"} Mar 18 14:00:04 crc kubenswrapper[4756]: I0318 14:00:04.362831 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501"} Mar 18 14:00:04 crc kubenswrapper[4756]: I0318 14:00:04.362842 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b"} Mar 18 14:00:04 crc kubenswrapper[4756]: I0318 14:00:04.362867 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:04 crc kubenswrapper[4756]: I0318 14:00:04.362939 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:04 crc kubenswrapper[4756]: I0318 14:00:04.363976 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:04 crc kubenswrapper[4756]: I0318 14:00:04.364012 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:04 crc kubenswrapper[4756]: I0318 14:00:04.364025 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:04 crc kubenswrapper[4756]: I0318 14:00:04.364266 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:04 crc kubenswrapper[4756]: I0318 14:00:04.364326 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:04 crc kubenswrapper[4756]: I0318 14:00:04.364350 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:04 crc kubenswrapper[4756]: I0318 14:00:04.796088 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.093475 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.102626 4756 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.320206 4756 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.371695 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d"} Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.371827 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.371840 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.373206 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.373266 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.373289 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.373311 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.373342 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.373362 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:05 crc kubenswrapper[4756]: 
I0318 14:00:05.503858 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.504067 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.504168 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.505733 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.505795 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.505819 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.617068 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.696055 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.698363 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.698445 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.698469 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:05 crc kubenswrapper[4756]: I0318 14:00:05.698521 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 14:00:06 crc 
kubenswrapper[4756]: I0318 14:00:06.374033 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:06 crc kubenswrapper[4756]: I0318 14:00:06.374086 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 14:00:06 crc kubenswrapper[4756]: I0318 14:00:06.374168 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:06 crc kubenswrapper[4756]: I0318 14:00:06.374196 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:06 crc kubenswrapper[4756]: I0318 14:00:06.376055 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:06 crc kubenswrapper[4756]: I0318 14:00:06.376162 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:06 crc kubenswrapper[4756]: I0318 14:00:06.376064 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:06 crc kubenswrapper[4756]: I0318 14:00:06.376188 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:06 crc kubenswrapper[4756]: I0318 14:00:06.376210 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:06 crc kubenswrapper[4756]: I0318 14:00:06.376235 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:06 crc kubenswrapper[4756]: I0318 14:00:06.376305 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:06 crc kubenswrapper[4756]: I0318 14:00:06.376338 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 
18 14:00:06 crc kubenswrapper[4756]: I0318 14:00:06.376353 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:07 crc kubenswrapper[4756]: I0318 14:00:07.504523 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 18 14:00:07 crc kubenswrapper[4756]: I0318 14:00:07.504769 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:07 crc kubenswrapper[4756]: I0318 14:00:07.506653 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:07 crc kubenswrapper[4756]: I0318 14:00:07.506715 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:07 crc kubenswrapper[4756]: I0318 14:00:07.506738 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:07 crc kubenswrapper[4756]: I0318 14:00:07.796953 4756 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:00:07 crc kubenswrapper[4756]: I0318 14:00:07.797070 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:00:09 crc kubenswrapper[4756]: I0318 14:00:09.175258 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:00:09 crc kubenswrapper[4756]: I0318 14:00:09.175567 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:09 crc kubenswrapper[4756]: I0318 14:00:09.177241 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:09 crc kubenswrapper[4756]: I0318 14:00:09.177331 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:09 crc kubenswrapper[4756]: I0318 14:00:09.177359 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:09 crc kubenswrapper[4756]: E0318 14:00:09.389616 4756 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 14:00:10 crc kubenswrapper[4756]: I0318 14:00:10.039903 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 18 14:00:10 crc kubenswrapper[4756]: I0318 14:00:10.040191 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:10 crc kubenswrapper[4756]: I0318 14:00:10.041578 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:10 crc kubenswrapper[4756]: I0318 14:00:10.041660 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:10 crc kubenswrapper[4756]: I0318 14:00:10.041699 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:13 crc kubenswrapper[4756]: I0318 14:00:12.938557 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:00:13 crc 
kubenswrapper[4756]: I0318 14:00:12.938707 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:13 crc kubenswrapper[4756]: I0318 14:00:12.940085 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:13 crc kubenswrapper[4756]: I0318 14:00:12.940141 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:13 crc kubenswrapper[4756]: I0318 14:00:12.940152 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:13 crc kubenswrapper[4756]: W0318 14:00:12.972874 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 14:00:13 crc kubenswrapper[4756]: I0318 14:00:12.972986 4756 trace.go:236] Trace[1573795396]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 14:00:02.972) (total time: 10000ms): Mar 18 14:00:13 crc kubenswrapper[4756]: Trace[1573795396]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (14:00:12.972) Mar 18 14:00:13 crc kubenswrapper[4756]: Trace[1573795396]: [10.000835493s] [10.000835493s] END Mar 18 14:00:13 crc kubenswrapper[4756]: E0318 14:00:12.973005 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 14:00:13 crc kubenswrapper[4756]: W0318 
14:00:13.211285 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 14:00:13 crc kubenswrapper[4756]: I0318 14:00:13.211400 4756 trace.go:236] Trace[132834015]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 14:00:03.209) (total time: 10001ms): Mar 18 14:00:13 crc kubenswrapper[4756]: Trace[132834015]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:00:13.211) Mar 18 14:00:13 crc kubenswrapper[4756]: Trace[132834015]: [10.001541762s] [10.001541762s] END Mar 18 14:00:13 crc kubenswrapper[4756]: E0318 14:00:13.211429 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 14:00:13 crc kubenswrapper[4756]: I0318 14:00:13.261932 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 18 14:00:13 crc kubenswrapper[4756]: W0318 14:00:13.398883 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 14:00:13 crc kubenswrapper[4756]: I0318 14:00:13.399049 4756 trace.go:236] Trace[2044626382]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 14:00:03.397) (total time: 10001ms): Mar 18 14:00:13 crc 
kubenswrapper[4756]: Trace[2044626382]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:00:13.398) Mar 18 14:00:13 crc kubenswrapper[4756]: Trace[2044626382]: [10.001391132s] [10.001391132s] END Mar 18 14:00:13 crc kubenswrapper[4756]: E0318 14:00:13.399094 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 14:00:13 crc kubenswrapper[4756]: E0318 14:00:13.682063 4756 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:13Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 14:00:13 crc kubenswrapper[4756]: E0318 14:00:13.684272 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:13Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 18 14:00:13 crc kubenswrapper[4756]: E0318 14:00:13.684996 4756 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-18T14:00:13Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189df4438f5ea59a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.259174298 +0000 UTC m=+0.573592263,LastTimestamp:2026-03-18 13:59:59.259174298 +0000 UTC m=+0.573592263,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:13 crc kubenswrapper[4756]: W0318 14:00:13.686840 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:13Z is after 2026-02-23T05:33:13Z Mar 18 14:00:13 crc kubenswrapper[4756]: E0318 14:00:13.687044 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:13Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 14:00:13 crc kubenswrapper[4756]: E0318 14:00:13.688199 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:13Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 14:00:13 crc kubenswrapper[4756]: I0318 14:00:13.695205 4756 
patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 14:00:13 crc kubenswrapper[4756]: I0318 14:00:13.695280 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 14:00:13 crc kubenswrapper[4756]: I0318 14:00:13.702513 4756 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 14:00:13 crc kubenswrapper[4756]: I0318 14:00:13.702581 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 14:00:14 crc kubenswrapper[4756]: I0318 14:00:14.262868 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:14Z is after 2026-02-23T05:33:13Z Mar 18 14:00:14 crc kubenswrapper[4756]: I0318 14:00:14.401819 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 14:00:14 crc kubenswrapper[4756]: I0318 14:00:14.405921 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ac262a2299741db6528266da300b1255851399275365cd9c84b5f7b5f82f6d2b" exitCode=255 Mar 18 14:00:14 crc kubenswrapper[4756]: I0318 14:00:14.406049 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ac262a2299741db6528266da300b1255851399275365cd9c84b5f7b5f82f6d2b"} Mar 18 14:00:14 crc kubenswrapper[4756]: I0318 14:00:14.407064 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:14 crc kubenswrapper[4756]: I0318 14:00:14.408707 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:14 crc kubenswrapper[4756]: I0318 14:00:14.408773 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:14 crc kubenswrapper[4756]: I0318 14:00:14.408797 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:14 crc kubenswrapper[4756]: I0318 14:00:14.409835 4756 scope.go:117] "RemoveContainer" containerID="ac262a2299741db6528266da300b1255851399275365cd9c84b5f7b5f82f6d2b" Mar 18 14:00:15 crc kubenswrapper[4756]: I0318 14:00:15.266560 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:15Z is after 2026-02-23T05:33:13Z Mar 18 14:00:15 crc kubenswrapper[4756]: 
I0318 14:00:15.412031 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 14:00:15 crc kubenswrapper[4756]: I0318 14:00:15.415536 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"619995bfe098b80f9390d11eb50e3dbc3e27e8305cdfdd0de28de64f9e3c7049"} Mar 18 14:00:15 crc kubenswrapper[4756]: I0318 14:00:15.415813 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:15 crc kubenswrapper[4756]: I0318 14:00:15.417651 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:15 crc kubenswrapper[4756]: I0318 14:00:15.417725 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:15 crc kubenswrapper[4756]: I0318 14:00:15.417762 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:15 crc kubenswrapper[4756]: I0318 14:00:15.625572 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:00:16 crc kubenswrapper[4756]: I0318 14:00:16.265263 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:16Z is after 2026-02-23T05:33:13Z Mar 18 14:00:16 crc kubenswrapper[4756]: I0318 14:00:16.421651 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 14:00:16 crc kubenswrapper[4756]: I0318 14:00:16.422990 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 14:00:16 crc kubenswrapper[4756]: I0318 14:00:16.426595 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="619995bfe098b80f9390d11eb50e3dbc3e27e8305cdfdd0de28de64f9e3c7049" exitCode=255 Mar 18 14:00:16 crc kubenswrapper[4756]: I0318 14:00:16.426667 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"619995bfe098b80f9390d11eb50e3dbc3e27e8305cdfdd0de28de64f9e3c7049"} Mar 18 14:00:16 crc kubenswrapper[4756]: I0318 14:00:16.426757 4756 scope.go:117] "RemoveContainer" containerID="ac262a2299741db6528266da300b1255851399275365cd9c84b5f7b5f82f6d2b" Mar 18 14:00:16 crc kubenswrapper[4756]: I0318 14:00:16.426793 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:16 crc kubenswrapper[4756]: I0318 14:00:16.429200 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:16 crc kubenswrapper[4756]: I0318 14:00:16.429365 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:16 crc kubenswrapper[4756]: I0318 14:00:16.429392 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:16 crc kubenswrapper[4756]: I0318 14:00:16.430622 4756 scope.go:117] "RemoveContainer" containerID="619995bfe098b80f9390d11eb50e3dbc3e27e8305cdfdd0de28de64f9e3c7049" Mar 18 14:00:16 
crc kubenswrapper[4756]: E0318 14:00:16.431106 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 14:00:16 crc kubenswrapper[4756]: I0318 14:00:16.434642 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:00:16 crc kubenswrapper[4756]: W0318 14:00:16.662552 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:16Z is after 2026-02-23T05:33:13Z Mar 18 14:00:16 crc kubenswrapper[4756]: E0318 14:00:16.662684 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 14:00:17 crc kubenswrapper[4756]: W0318 14:00:17.093612 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:17Z is after 2026-02-23T05:33:13Z Mar 18 
14:00:17 crc kubenswrapper[4756]: E0318 14:00:17.093730 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 14:00:17 crc kubenswrapper[4756]: I0318 14:00:17.265336 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:17Z is after 2026-02-23T05:33:13Z Mar 18 14:00:17 crc kubenswrapper[4756]: I0318 14:00:17.433087 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 14:00:17 crc kubenswrapper[4756]: I0318 14:00:17.436696 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:17 crc kubenswrapper[4756]: I0318 14:00:17.437974 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:17 crc kubenswrapper[4756]: I0318 14:00:17.438028 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:17 crc kubenswrapper[4756]: I0318 14:00:17.438045 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:17 crc kubenswrapper[4756]: I0318 14:00:17.438889 4756 scope.go:117] "RemoveContainer" containerID="619995bfe098b80f9390d11eb50e3dbc3e27e8305cdfdd0de28de64f9e3c7049" Mar 18 14:00:17 
crc kubenswrapper[4756]: E0318 14:00:17.439220 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 14:00:17 crc kubenswrapper[4756]: I0318 14:00:17.538100 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 18 14:00:17 crc kubenswrapper[4756]: I0318 14:00:17.538432 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:17 crc kubenswrapper[4756]: I0318 14:00:17.539900 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:17 crc kubenswrapper[4756]: I0318 14:00:17.540212 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:17 crc kubenswrapper[4756]: I0318 14:00:17.540244 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:17 crc kubenswrapper[4756]: I0318 14:00:17.555245 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 18 14:00:17 crc kubenswrapper[4756]: I0318 14:00:17.797105 4756 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:00:17 crc kubenswrapper[4756]: I0318 14:00:17.797244 4756 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:00:17 crc kubenswrapper[4756]: W0318 14:00:17.944073 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:17Z is after 2026-02-23T05:33:13Z Mar 18 14:00:17 crc kubenswrapper[4756]: E0318 14:00:17.944214 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 14:00:18 crc kubenswrapper[4756]: I0318 14:00:18.265436 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:18Z is after 2026-02-23T05:33:13Z Mar 18 14:00:18 crc kubenswrapper[4756]: I0318 14:00:18.439496 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:18 crc kubenswrapper[4756]: I0318 14:00:18.439662 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:18 crc 
kubenswrapper[4756]: I0318 14:00:18.441182 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:18 crc kubenswrapper[4756]: I0318 14:00:18.441214 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:18 crc kubenswrapper[4756]: I0318 14:00:18.441246 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:18 crc kubenswrapper[4756]: I0318 14:00:18.441259 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:18 crc kubenswrapper[4756]: I0318 14:00:18.441263 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:18 crc kubenswrapper[4756]: I0318 14:00:18.441284 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:18 crc kubenswrapper[4756]: I0318 14:00:18.441924 4756 scope.go:117] "RemoveContainer" containerID="619995bfe098b80f9390d11eb50e3dbc3e27e8305cdfdd0de28de64f9e3c7049" Mar 18 14:00:18 crc kubenswrapper[4756]: E0318 14:00:18.442170 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 14:00:19 crc kubenswrapper[4756]: I0318 14:00:19.175456 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:00:19 crc kubenswrapper[4756]: I0318 14:00:19.267333 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:19Z is after 2026-02-23T05:33:13Z Mar 18 14:00:19 crc kubenswrapper[4756]: E0318 14:00:19.389800 4756 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 14:00:19 crc kubenswrapper[4756]: I0318 14:00:19.442365 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:19 crc kubenswrapper[4756]: I0318 14:00:19.443989 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:19 crc kubenswrapper[4756]: I0318 14:00:19.444050 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:19 crc kubenswrapper[4756]: I0318 14:00:19.444076 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:19 crc kubenswrapper[4756]: I0318 14:00:19.444965 4756 scope.go:117] "RemoveContainer" containerID="619995bfe098b80f9390d11eb50e3dbc3e27e8305cdfdd0de28de64f9e3c7049" Mar 18 14:00:19 crc kubenswrapper[4756]: E0318 14:00:19.445273 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 14:00:20 crc kubenswrapper[4756]: E0318 14:00:20.088357 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:20Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 14:00:20 crc kubenswrapper[4756]: I0318 14:00:20.088624 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:20 crc kubenswrapper[4756]: I0318 14:00:20.090052 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:20 crc kubenswrapper[4756]: I0318 14:00:20.090084 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:20 crc kubenswrapper[4756]: I0318 14:00:20.090093 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:20 crc kubenswrapper[4756]: I0318 14:00:20.090130 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 14:00:20 crc kubenswrapper[4756]: E0318 14:00:20.093892 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:20Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 14:00:20 crc kubenswrapper[4756]: I0318 14:00:20.266189 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:20Z is after 2026-02-23T05:33:13Z Mar 18 14:00:20 crc kubenswrapper[4756]: W0318 14:00:20.519868 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:20Z is after 2026-02-23T05:33:13Z Mar 18 14:00:20 crc kubenswrapper[4756]: E0318 14:00:20.519978 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 14:00:21 crc kubenswrapper[4756]: I0318 14:00:21.254036 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:00:21 crc kubenswrapper[4756]: I0318 14:00:21.254290 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:21 crc kubenswrapper[4756]: I0318 14:00:21.255740 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:21 crc kubenswrapper[4756]: I0318 14:00:21.255793 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:21 crc kubenswrapper[4756]: I0318 14:00:21.255811 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:21 crc kubenswrapper[4756]: I0318 14:00:21.256628 4756 scope.go:117] "RemoveContainer" containerID="619995bfe098b80f9390d11eb50e3dbc3e27e8305cdfdd0de28de64f9e3c7049" Mar 18 14:00:21 crc kubenswrapper[4756]: E0318 14:00:21.256893 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 14:00:21 crc kubenswrapper[4756]: I0318 14:00:21.265827 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:21Z is after 2026-02-23T05:33:13Z Mar 18 14:00:21 crc kubenswrapper[4756]: I0318 14:00:21.824973 4756 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 14:00:21 crc kubenswrapper[4756]: E0318 14:00:21.829385 4756 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 14:00:22 crc kubenswrapper[4756]: I0318 14:00:22.265586 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:22Z is after 2026-02-23T05:33:13Z Mar 18 14:00:23 crc kubenswrapper[4756]: I0318 14:00:23.265978 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:23Z is after 2026-02-23T05:33:13Z Mar 18 14:00:23 crc kubenswrapper[4756]: W0318 14:00:23.635890 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:23Z is after 2026-02-23T05:33:13Z Mar 18 14:00:23 crc kubenswrapper[4756]: E0318 14:00:23.635993 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 14:00:23 crc kubenswrapper[4756]: E0318 14:00:23.691380 4756 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:23Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189df4438f5ea59a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.259174298 +0000 UTC m=+0.573592263,LastTimestamp:2026-03-18 13:59:59.259174298 +0000 UTC m=+0.573592263,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:24 crc kubenswrapper[4756]: I0318 14:00:24.266432 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:24Z is after 2026-02-23T05:33:13Z Mar 18 14:00:25 crc kubenswrapper[4756]: I0318 14:00:25.265943 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:25Z is after 2026-02-23T05:33:13Z Mar 18 14:00:25 crc kubenswrapper[4756]: W0318 14:00:25.470729 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:25Z is after 2026-02-23T05:33:13Z Mar 18 14:00:25 crc kubenswrapper[4756]: E0318 14:00:25.470835 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:25Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 14:00:26 crc kubenswrapper[4756]: I0318 14:00:26.264937 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:26Z is after 2026-02-23T05:33:13Z Mar 18 14:00:27 crc kubenswrapper[4756]: E0318 14:00:27.092796 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 14:00:27 crc kubenswrapper[4756]: I0318 14:00:27.094837 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:27 crc kubenswrapper[4756]: I0318 14:00:27.096597 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:27 crc kubenswrapper[4756]: I0318 14:00:27.096684 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:27 crc kubenswrapper[4756]: I0318 14:00:27.096701 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:27 crc kubenswrapper[4756]: I0318 14:00:27.096733 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 14:00:27 crc kubenswrapper[4756]: E0318 14:00:27.099784 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 14:00:27 crc kubenswrapper[4756]: I0318 14:00:27.265597 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 2026-02-23T05:33:13Z Mar 18 14:00:27 crc kubenswrapper[4756]: W0318 14:00:27.790174 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 2026-02-23T05:33:13Z Mar 18 14:00:27 crc kubenswrapper[4756]: E0318 14:00:27.790285 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 14:00:27 crc kubenswrapper[4756]: I0318 14:00:27.796675 4756 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:00:27 crc kubenswrapper[4756]: I0318 14:00:27.796742 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers)" Mar 18 14:00:27 crc kubenswrapper[4756]: I0318 14:00:27.796810 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:00:27 crc kubenswrapper[4756]: I0318 14:00:27.797022 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:27 crc kubenswrapper[4756]: I0318 14:00:27.798577 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:27 crc kubenswrapper[4756]: I0318 14:00:27.798645 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:27 crc kubenswrapper[4756]: I0318 14:00:27.798662 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:27 crc kubenswrapper[4756]: I0318 14:00:27.799318 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 14:00:27 crc kubenswrapper[4756]: I0318 14:00:27.799569 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76" gracePeriod=30 Mar 18 14:00:28 crc kubenswrapper[4756]: I0318 14:00:28.266010 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:28Z is after 2026-02-23T05:33:13Z Mar 18 14:00:28 crc kubenswrapper[4756]: I0318 14:00:28.471960 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 14:00:28 crc kubenswrapper[4756]: I0318 14:00:28.472559 4756 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76" exitCode=255 Mar 18 14:00:28 crc kubenswrapper[4756]: I0318 14:00:28.472633 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76"} Mar 18 14:00:28 crc kubenswrapper[4756]: I0318 14:00:28.472688 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5"} Mar 18 14:00:28 crc kubenswrapper[4756]: I0318 14:00:28.472856 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:28 crc kubenswrapper[4756]: I0318 14:00:28.474307 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:28 crc kubenswrapper[4756]: I0318 14:00:28.474378 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:28 crc kubenswrapper[4756]: I0318 14:00:28.474396 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Mar 18 14:00:29 crc kubenswrapper[4756]: I0318 14:00:29.265670 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:29Z is after 2026-02-23T05:33:13Z Mar 18 14:00:29 crc kubenswrapper[4756]: E0318 14:00:29.390106 4756 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 14:00:30 crc kubenswrapper[4756]: I0318 14:00:30.265929 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:30Z is after 2026-02-23T05:33:13Z Mar 18 14:00:31 crc kubenswrapper[4756]: I0318 14:00:31.266431 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:31Z is after 2026-02-23T05:33:13Z Mar 18 14:00:32 crc kubenswrapper[4756]: I0318 14:00:32.267090 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:32Z is after 2026-02-23T05:33:13Z Mar 18 14:00:32 crc kubenswrapper[4756]: I0318 14:00:32.315036 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:32 crc kubenswrapper[4756]: I0318 
14:00:32.316650 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:32 crc kubenswrapper[4756]: I0318 14:00:32.316718 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:32 crc kubenswrapper[4756]: I0318 14:00:32.316735 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:32 crc kubenswrapper[4756]: I0318 14:00:32.317576 4756 scope.go:117] "RemoveContainer" containerID="619995bfe098b80f9390d11eb50e3dbc3e27e8305cdfdd0de28de64f9e3c7049" Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.035528 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.035764 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.037254 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.037464 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.037693 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.266163 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:33Z is after 2026-02-23T05:33:13Z Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.488627 4756 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.489306 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.491668 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b8806d786a8eb11f43617f498c78b7e1d20a8f0851548d2983f2bb37e7860906" exitCode=255 Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.491723 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b8806d786a8eb11f43617f498c78b7e1d20a8f0851548d2983f2bb37e7860906"} Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.491770 4756 scope.go:117] "RemoveContainer" containerID="619995bfe098b80f9390d11eb50e3dbc3e27e8305cdfdd0de28de64f9e3c7049" Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.491939 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.493309 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.493384 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.493411 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:33 crc kubenswrapper[4756]: I0318 14:00:33.494553 4756 scope.go:117] "RemoveContainer" 
containerID="b8806d786a8eb11f43617f498c78b7e1d20a8f0851548d2983f2bb37e7860906" Mar 18 14:00:33 crc kubenswrapper[4756]: E0318 14:00:33.494934 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 14:00:33 crc kubenswrapper[4756]: E0318 14:00:33.696953 4756 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:33Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189df4438f5ea59a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.259174298 +0000 UTC m=+0.573592263,LastTimestamp:2026-03-18 13:59:59.259174298 +0000 UTC m=+0.573592263,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:34 crc kubenswrapper[4756]: E0318 14:00:34.096866 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:34Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 14:00:34 crc kubenswrapper[4756]: I0318 14:00:34.099902 4756 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:34 crc kubenswrapper[4756]: I0318 14:00:34.102211 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:34 crc kubenswrapper[4756]: I0318 14:00:34.102620 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:34 crc kubenswrapper[4756]: I0318 14:00:34.102783 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:34 crc kubenswrapper[4756]: I0318 14:00:34.102998 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 14:00:34 crc kubenswrapper[4756]: E0318 14:00:34.106592 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:34Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 14:00:34 crc kubenswrapper[4756]: I0318 14:00:34.263634 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:34Z is after 2026-02-23T05:33:13Z Mar 18 14:00:34 crc kubenswrapper[4756]: I0318 14:00:34.497399 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 14:00:34 crc kubenswrapper[4756]: I0318 14:00:34.796707 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:00:34 crc kubenswrapper[4756]: 
I0318 14:00:34.796929 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:34 crc kubenswrapper[4756]: I0318 14:00:34.798516 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:34 crc kubenswrapper[4756]: I0318 14:00:34.798568 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:34 crc kubenswrapper[4756]: I0318 14:00:34.798587 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:34 crc kubenswrapper[4756]: W0318 14:00:34.879845 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:34Z is after 2026-02-23T05:33:13Z Mar 18 14:00:34 crc kubenswrapper[4756]: E0318 14:00:34.879995 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 14:00:35 crc kubenswrapper[4756]: I0318 14:00:35.264854 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:35Z is after 2026-02-23T05:33:13Z Mar 18 14:00:36 crc kubenswrapper[4756]: I0318 
14:00:36.265706 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:36Z is after 2026-02-23T05:33:13Z Mar 18 14:00:37 crc kubenswrapper[4756]: I0318 14:00:37.263805 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:37Z is after 2026-02-23T05:33:13Z Mar 18 14:00:37 crc kubenswrapper[4756]: I0318 14:00:37.797953 4756 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:00:37 crc kubenswrapper[4756]: I0318 14:00:37.798084 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:00:38 crc kubenswrapper[4756]: I0318 14:00:38.265714 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:38Z is after 
2026-02-23T05:33:13Z Mar 18 14:00:39 crc kubenswrapper[4756]: I0318 14:00:39.176212 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:00:39 crc kubenswrapper[4756]: I0318 14:00:39.176433 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:39 crc kubenswrapper[4756]: I0318 14:00:39.178046 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:39 crc kubenswrapper[4756]: I0318 14:00:39.178101 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:39 crc kubenswrapper[4756]: I0318 14:00:39.178151 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:39 crc kubenswrapper[4756]: I0318 14:00:39.178903 4756 scope.go:117] "RemoveContainer" containerID="b8806d786a8eb11f43617f498c78b7e1d20a8f0851548d2983f2bb37e7860906" Mar 18 14:00:39 crc kubenswrapper[4756]: E0318 14:00:39.179316 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 14:00:39 crc kubenswrapper[4756]: I0318 14:00:39.206652 4756 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 14:00:39 crc kubenswrapper[4756]: I0318 14:00:39.226275 4756 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 14:00:39 crc kubenswrapper[4756]: I0318 14:00:39.267243 4756 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:39 crc kubenswrapper[4756]: E0318 14:00:39.390214 4756 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 14:00:40 crc kubenswrapper[4756]: I0318 14:00:40.268382 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:41 crc kubenswrapper[4756]: E0318 14:00:41.101476 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 14:00:41 crc kubenswrapper[4756]: I0318 14:00:41.107586 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:41 crc kubenswrapper[4756]: I0318 14:00:41.108844 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:41 crc kubenswrapper[4756]: I0318 14:00:41.108906 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:41 crc kubenswrapper[4756]: I0318 14:00:41.108921 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:41 crc kubenswrapper[4756]: I0318 14:00:41.108984 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 14:00:41 crc kubenswrapper[4756]: E0318 14:00:41.113400 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is 
forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 14:00:41 crc kubenswrapper[4756]: I0318 14:00:41.254306 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:00:41 crc kubenswrapper[4756]: I0318 14:00:41.254591 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:41 crc kubenswrapper[4756]: I0318 14:00:41.256249 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:41 crc kubenswrapper[4756]: I0318 14:00:41.256303 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:41 crc kubenswrapper[4756]: I0318 14:00:41.256320 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:41 crc kubenswrapper[4756]: I0318 14:00:41.257062 4756 scope.go:117] "RemoveContainer" containerID="b8806d786a8eb11f43617f498c78b7e1d20a8f0851548d2983f2bb37e7860906" Mar 18 14:00:41 crc kubenswrapper[4756]: E0318 14:00:41.257346 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 14:00:41 crc kubenswrapper[4756]: I0318 14:00:41.265162 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:42 crc kubenswrapper[4756]: I0318 
14:00:42.266079 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:43 crc kubenswrapper[4756]: I0318 14:00:43.268772 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.705225 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4438f5ea59a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.259174298 +0000 UTC m=+0.573592263,LastTimestamp:2026-03-18 13:59:59.259174298 +0000 UTC m=+0.573592263,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.712487 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df443924344c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307711686 +0000 UTC m=+0.622129661,LastTimestamp:2026-03-18 13:59:59.307711686 +0000 UTC m=+0.622129661,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.719671 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4439243afd9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307739097 +0000 UTC m=+0.622157092,LastTimestamp:2026-03-18 13:59:59.307739097 +0000 UTC m=+0.622157092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.726347 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4439243ed5f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307754847 +0000 UTC m=+0.622172822,LastTimestamp:2026-03-18 13:59:59.307754847 +0000 UTC 
m=+0.622172822,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.732956 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df44396d8af06 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.384612614 +0000 UTC m=+0.699030589,LastTimestamp:2026-03-18 13:59:59.384612614 +0000 UTC m=+0.699030589,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.739619 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df443924344c6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df443924344c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307711686 +0000 UTC m=+0.622129661,LastTimestamp:2026-03-18 13:59:59.415560655 +0000 UTC m=+0.729978630,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc 
kubenswrapper[4756]: E0318 14:00:43.741875 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df4439243afd9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4439243afd9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307739097 +0000 UTC m=+0.622157092,LastTimestamp:2026-03-18 13:59:59.415593785 +0000 UTC m=+0.730011760,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.746661 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df4439243ed5f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4439243ed5f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307754847 +0000 UTC m=+0.622172822,LastTimestamp:2026-03-18 13:59:59.415602905 +0000 UTC m=+0.730020880,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.748666 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df443924344c6\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df443924344c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307711686 +0000 UTC m=+0.622129661,LastTimestamp:2026-03-18 13:59:59.416418497 +0000 UTC m=+0.730836472,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.754362 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df4439243afd9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4439243afd9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307739097 +0000 UTC m=+0.622157092,LastTimestamp:2026-03-18 13:59:59.416435827 +0000 UTC m=+0.730853792,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.761589 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df4439243ed5f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4439243ed5f default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307754847 +0000 UTC m=+0.622172822,LastTimestamp:2026-03-18 13:59:59.416443727 +0000 UTC m=+0.730861702,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: W0318 14:00:43.762057 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.762154 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.766796 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df443924344c6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df443924344c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307711686 +0000 UTC 
m=+0.622129661,LastTimestamp:2026-03-18 13:59:59.416796488 +0000 UTC m=+0.731214463,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.773197 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df4439243afd9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4439243afd9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307739097 +0000 UTC m=+0.622157092,LastTimestamp:2026-03-18 13:59:59.416811528 +0000 UTC m=+0.731229503,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.780033 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df4439243ed5f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4439243ed5f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307754847 +0000 UTC m=+0.622172822,LastTimestamp:2026-03-18 13:59:59.416819718 +0000 UTC m=+0.731237693,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.786568 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df443924344c6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df443924344c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307711686 +0000 UTC m=+0.622129661,LastTimestamp:2026-03-18 13:59:59.417179849 +0000 UTC m=+0.731597824,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.792974 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df4439243afd9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4439243afd9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307739097 +0000 UTC m=+0.622157092,LastTimestamp:2026-03-18 13:59:59.417211569 +0000 UTC m=+0.731629544,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.799833 4756 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df4439243ed5f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4439243ed5f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307754847 +0000 UTC m=+0.622172822,LastTimestamp:2026-03-18 13:59:59.417221649 +0000 UTC m=+0.731639624,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.806582 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df443924344c6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df443924344c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307711686 +0000 UTC m=+0.622129661,LastTimestamp:2026-03-18 13:59:59.41771913 +0000 UTC m=+0.732137115,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.810824 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df4439243afd9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4439243afd9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307739097 +0000 UTC m=+0.622157092,LastTimestamp:2026-03-18 13:59:59.41774531 +0000 UTC m=+0.732163295,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.816062 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df4439243ed5f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4439243ed5f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307754847 +0000 UTC m=+0.622172822,LastTimestamp:2026-03-18 13:59:59.41775826 +0000 UTC m=+0.732176255,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.822937 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df443924344c6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df443924344c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307711686 +0000 UTC m=+0.622129661,LastTimestamp:2026-03-18 13:59:59.41821964 +0000 UTC m=+0.732637615,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.828733 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df4439243afd9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4439243afd9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307739097 +0000 UTC m=+0.622157092,LastTimestamp:2026-03-18 13:59:59.41823496 +0000 UTC m=+0.732652925,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.833596 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df4439243ed5f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4439243ed5f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307754847 +0000 UTC m=+0.622172822,LastTimestamp:2026-03-18 13:59:59.41824297 +0000 UTC m=+0.732660945,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.840628 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df443924344c6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df443924344c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307711686 +0000 UTC m=+0.622129661,LastTimestamp:2026-03-18 13:59:59.418362711 +0000 UTC m=+0.732780686,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.846623 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189df4439243afd9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189df4439243afd9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.307739097 +0000 UTC 
m=+0.622157092,LastTimestamp:2026-03-18 13:59:59.418372321 +0000 UTC m=+0.732790296,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.854720 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df443b111d746 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.824566086 +0000 UTC m=+1.138984071,LastTimestamp:2026-03-18 13:59:59.824566086 +0000 UTC m=+1.138984071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.861518 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df443b11212a6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.824581286 +0000 UTC m=+1.138999281,LastTimestamp:2026-03-18 13:59:59.824581286 +0000 UTC m=+1.138999281,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.866821 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df443b18fbc3f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.832816703 +0000 UTC m=+1.147234688,LastTimestamp:2026-03-18 13:59:59.832816703 +0000 UTC m=+1.147234688,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.871935 4756 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df443b22f41de openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.843271134 +0000 UTC m=+1.157689119,LastTimestamp:2026-03-18 13:59:59.843271134 +0000 UTC m=+1.157689119,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.878114 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df443b23c9565 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 13:59:59.844144485 +0000 UTC m=+1.158562470,LastTimestamp:2026-03-18 13:59:59.844144485 +0000 UTC m=+1.158562470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.883712 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df443d4e7e81f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.425797663 +0000 UTC m=+1.740215628,LastTimestamp:2026-03-18 14:00:00.425797663 +0000 UTC m=+1.740215628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.890294 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df443d4fecd44 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.427298116 +0000 UTC m=+1.741716091,LastTimestamp:2026-03-18 14:00:00.427298116 +0000 UTC m=+1.741716091,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.896817 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df443d50f832a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.428393258 +0000 UTC m=+1.742811243,LastTimestamp:2026-03-18 14:00:00.428393258 +0000 UTC m=+1.742811243,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.903529 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df443d5528af7 openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.432786167 +0000 UTC m=+1.747204142,LastTimestamp:2026-03-18 14:00:00.432786167 +0000 UTC m=+1.747204142,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.908524 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df443d5639412 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.43390261 +0000 UTC m=+1.748320585,LastTimestamp:2026-03-18 14:00:00.43390261 +0000 UTC m=+1.748320585,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.916612 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df443d567c4f6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.43417727 +0000 UTC m=+1.748595245,LastTimestamp:2026-03-18 14:00:00.43417727 +0000 UTC m=+1.748595245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.922999 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df443d57b70de openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.435466462 +0000 UTC m=+1.749884437,LastTimestamp:2026-03-18 14:00:00.435466462 +0000 UTC m=+1.749884437,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: 
E0318 14:00:43.929558 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df443d59f0071 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.437796977 +0000 UTC m=+1.752214952,LastTimestamp:2026-03-18 14:00:00.437796977 +0000 UTC m=+1.752214952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.934794 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df443d5b7512f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.439390511 +0000 UTC m=+1.753808476,LastTimestamp:2026-03-18 14:00:00.439390511 +0000 UTC m=+1.753808476,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: 
E0318 14:00:43.939547 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df443d64d08ea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.44920241 +0000 UTC m=+1.763620385,LastTimestamp:2026-03-18 14:00:00.44920241 +0000 UTC m=+1.763620385,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.944686 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df443d661fbb3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.450575283 +0000 UTC m=+1.764993258,LastTimestamp:2026-03-18 14:00:00.450575283 +0000 UTC m=+1.764993258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.949923 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df443e6ad5bf5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.723950581 +0000 UTC m=+2.038368546,LastTimestamp:2026-03-18 14:00:00.723950581 +0000 UTC m=+2.038368546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.955965 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df443e73a4762 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.73318589 +0000 UTC m=+2.047603905,LastTimestamp:2026-03-18 14:00:00.73318589 +0000 UTC 
m=+2.047603905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.963482 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df443e7486894 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.734111892 +0000 UTC m=+2.048529897,LastTimestamp:2026-03-18 14:00:00.734111892 +0000 UTC m=+2.048529897,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.970172 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df443f1731f96 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.904683414 +0000 UTC m=+2.219101389,LastTimestamp:2026-03-18 14:00:00.904683414 +0000 UTC m=+2.219101389,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.975310 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df443f2012871 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.913991793 +0000 UTC m=+2.228409768,LastTimestamp:2026-03-18 14:00:00.913991793 +0000 UTC m=+2.228409768,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.980969 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df443f21048a3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.914983075 +0000 UTC m=+2.229401040,LastTimestamp:2026-03-18 14:00:00.914983075 +0000 UTC m=+2.229401040,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.986644 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df443fc2fa1fe openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.084809726 +0000 UTC m=+2.399227701,LastTimestamp:2026-03-18 14:00:01.084809726 +0000 UTC 
m=+2.399227701,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.990930 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df443fce24807 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.096517639 +0000 UTC m=+2.410935614,LastTimestamp:2026-03-18 14:00:01.096517639 +0000 UTC m=+2.410935614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:43 crc kubenswrapper[4756]: E0318 14:00:43.995577 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df4440aef8da5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.332268453 +0000 UTC m=+2.646686428,LastTimestamp:2026-03-18 14:00:01.332268453 +0000 UTC m=+2.646686428,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.000353 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df4440b1c33a1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.335194529 +0000 UTC m=+2.649612504,LastTimestamp:2026-03-18 14:00:01.335194529 +0000 UTC m=+2.649612504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.006717 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df4440b491633 openshift-machine-config-operator 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.338136115 +0000 UTC m=+2.652554090,LastTimestamp:2026-03-18 14:00:01.338136115 +0000 UTC m=+2.652554090,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.011735 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df4440bb35316 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.345098518 +0000 UTC m=+2.659516503,LastTimestamp:2026-03-18 14:00:01.345098518 +0000 UTC m=+2.659516503,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 
14:00:44.016317 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df4441920ff1b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.570389787 +0000 UTC m=+2.884807762,LastTimestamp:2026-03-18 14:00:01.570389787 +0000 UTC m=+2.884807762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.022389 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df44419377b12 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.571863314 +0000 UTC m=+2.886281289,LastTimestamp:2026-03-18 14:00:01.571863314 +0000 UTC m=+2.886281289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc 
kubenswrapper[4756]: E0318 14:00:44.028628 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df4441938b07c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.571942524 +0000 UTC m=+2.886360489,LastTimestamp:2026-03-18 14:00:01.571942524 +0000 UTC m=+2.886360489,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.033887 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df4441941f63f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.572550207 +0000 UTC m=+2.886968182,LastTimestamp:2026-03-18 14:00:01.572550207 +0000 UTC m=+2.886968182,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.041258 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df44419cc9362 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.581634402 +0000 UTC m=+2.896052377,LastTimestamp:2026-03-18 14:00:01.581634402 +0000 UTC m=+2.896052377,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.049910 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df44419dd1949 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.582717257 +0000 UTC m=+2.897135232,LastTimestamp:2026-03-18 14:00:01.582717257 +0000 UTC m=+2.897135232,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.054611 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df4441a36c114 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.588592916 +0000 UTC m=+2.903010901,LastTimestamp:2026-03-18 14:00:01.588592916 +0000 UTC m=+2.903010901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.059723 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df4441a4d9266 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.590088294 +0000 UTC m=+2.904506269,LastTimestamp:2026-03-18 14:00:01.590088294 +0000 UTC m=+2.904506269,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.065099 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df4441a77b28e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.592849038 +0000 UTC m=+2.907267023,LastTimestamp:2026-03-18 14:00:01.592849038 +0000 UTC m=+2.907267023,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.069684 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189df4441a78dfce openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.592926158 +0000 UTC m=+2.907344133,LastTimestamp:2026-03-18 14:00:01.592926158 +0000 UTC m=+2.907344133,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.074733 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df444246d6b04 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.759947524 +0000 UTC m=+3.074365499,LastTimestamp:2026-03-18 14:00:01.759947524 +0000 UTC m=+3.074365499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.079087 4756 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df444246db0c8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.759965384 +0000 UTC m=+3.074383359,LastTimestamp:2026-03-18 14:00:01.759965384 +0000 UTC m=+3.074383359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.084344 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df444250b7cd7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.770306775 +0000 UTC m=+3.084724750,LastTimestamp:2026-03-18 14:00:01.770306775 +0000 UTC m=+3.084724750,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 
crc kubenswrapper[4756]: E0318 14:00:44.089166 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df444251a0cd3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.771261139 +0000 UTC m=+3.085679114,LastTimestamp:2026-03-18 14:00:01.771261139 +0000 UTC m=+3.085679114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.093141 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df444252f8467 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.772668007 +0000 UTC 
m=+3.087085972,LastTimestamp:2026-03-18 14:00:01.772668007 +0000 UTC m=+3.087085972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.097242 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df4442539ed6a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.77335025 +0000 UTC m=+3.087768215,LastTimestamp:2026-03-18 14:00:01.77335025 +0000 UTC m=+3.087768215,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.100710 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df4442e59ba3f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.926429247 +0000 UTC m=+3.240847222,LastTimestamp:2026-03-18 14:00:01.926429247 +0000 UTC m=+3.240847222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.105901 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df4442e8ffa3f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.929984575 +0000 UTC m=+3.244402550,LastTimestamp:2026-03-18 14:00:01.929984575 +0000 UTC m=+3.244402550,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.110473 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df4442f066b19 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.937746713 +0000 UTC m=+3.252164698,LastTimestamp:2026-03-18 14:00:01.937746713 +0000 UTC m=+3.252164698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.113903 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df4442f1484c4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.938670788 +0000 UTC m=+3.253088763,LastTimestamp:2026-03-18 14:00:01.938670788 +0000 UTC m=+3.253088763,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.118096 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189df4442f38cfbf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:01.941049279 +0000 UTC m=+3.255467254,LastTimestamp:2026-03-18 14:00:01.941049279 +0000 UTC m=+3.255467254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.122449 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df44437d116e4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:02.085246692 +0000 UTC 
m=+3.399664667,LastTimestamp:2026-03-18 14:00:02.085246692 +0000 UTC m=+3.399664667,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.126425 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df444387b32e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:02.096394977 +0000 UTC m=+3.410812952,LastTimestamp:2026-03-18 14:00:02.096394977 +0000 UTC m=+3.410812952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.131311 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df444388dcbc4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:02.097613764 +0000 UTC m=+3.412031739,LastTimestamp:2026-03-18 14:00:02.097613764 +0000 UTC m=+3.412031739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.139272 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df44441dd0264 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:02.253800036 +0000 UTC m=+3.568218021,LastTimestamp:2026-03-18 14:00:02.253800036 +0000 UTC m=+3.568218021,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.143761 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df444429b5c8e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:02.266274958 +0000 UTC m=+3.580692943,LastTimestamp:2026-03-18 14:00:02.266274958 +0000 UTC m=+3.580692943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.153980 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df44447a3386b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:02.350676075 +0000 UTC m=+3.665094050,LastTimestamp:2026-03-18 14:00:02.350676075 +0000 UTC m=+3.665094050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.156228 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df44453b902b3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:02.553430707 +0000 UTC m=+3.867848692,LastTimestamp:2026-03-18 14:00:02.553430707 +0000 UTC m=+3.867848692,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.159826 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df44454bb30be openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:02.570350782 +0000 UTC m=+3.884768777,LastTimestamp:2026-03-18 14:00:02.570350782 +0000 UTC m=+3.884768777,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.164808 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189df44483c32827 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:03.359402023 +0000 UTC m=+4.673820008,LastTimestamp:2026-03-18 14:00:03.359402023 +0000 UTC m=+4.673820008,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.168551 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df4448dda4db0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:03.52869112 +0000 UTC m=+4.843109135,LastTimestamp:2026-03-18 14:00:03.52869112 +0000 UTC m=+4.843109135,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.171859 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df4448e8fe057 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:03.540590679 +0000 UTC m=+4.855008664,LastTimestamp:2026-03-18 14:00:03.540590679 +0000 UTC m=+4.855008664,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.175220 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df4448ea15508 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:03.541734664 +0000 UTC m=+4.856152649,LastTimestamp:2026-03-18 14:00:03.541734664 +0000 UTC m=+4.856152649,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.178871 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df4449b5f9fbe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:03.755532222 +0000 UTC m=+5.069950237,LastTimestamp:2026-03-18 14:00:03.755532222 +0000 UTC m=+5.069950237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.182471 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df4449be780e0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:03.764437216 +0000 UTC m=+5.078855201,LastTimestamp:2026-03-18 14:00:03.764437216 +0000 UTC m=+5.078855201,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.186820 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189df4449bf7cb67 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:03.765504871 +0000 UTC m=+5.079922876,LastTimestamp:2026-03-18 14:00:03.765504871 +0000 UTC m=+5.079922876,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.190983 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df444a84ea2ef openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:03.972522735 +0000 UTC m=+5.286940750,LastTimestamp:2026-03-18 14:00:03.972522735 +0000 UTC m=+5.286940750,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.195135 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df444a9409036 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:03.988377654 +0000 UTC m=+5.302795659,LastTimestamp:2026-03-18 14:00:03.988377654 +0000 UTC m=+5.302795659,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.198475 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df444a952714d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:03.989549389 +0000 UTC m=+5.303967394,LastTimestamp:2026-03-18 14:00:03.989549389 +0000 UTC m=+5.303967394,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.204673 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df444b76eebf4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:04.22629682 +0000 UTC m=+5.540714835,LastTimestamp:2026-03-18 14:00:04.22629682 +0000 UTC m=+5.540714835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.210833 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df444b813c979 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:04.237101433 +0000 UTC m=+5.551519408,LastTimestamp:2026-03-18 14:00:04.237101433 +0000 UTC m=+5.551519408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.214925 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df444b8287574 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:04.23845618 +0000 UTC m=+5.552874145,LastTimestamp:2026-03-18 14:00:04.23845618 +0000 UTC m=+5.552874145,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.220180 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df444c5358e0d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:04.457418253 +0000 UTC m=+5.771836248,LastTimestamp:2026-03-18 14:00:04.457418253 +0000 UTC m=+5.771836248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.225362 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189df444c63352dd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:04.474049245 +0000 UTC m=+5.788467260,LastTimestamp:2026-03-18 14:00:04.474049245 +0000 UTC m=+5.788467260,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.235954 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 14:00:44 crc kubenswrapper[4756]: &Event{ObjectMeta:{kube-controller-manager-crc.189df4458c4417d1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 18 14:00:44 crc kubenswrapper[4756]: body: Mar 18 14:00:44 crc kubenswrapper[4756]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:07.797037009 +0000 UTC m=+9.111455024,LastTimestamp:2026-03-18 14:00:07.797037009 +0000 UTC m=+9.111455024,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 14:00:44 crc kubenswrapper[4756]: > Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.241350 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df4458c4546df openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:07.797114591 +0000 UTC m=+9.111532606,LastTimestamp:2026-03-18 14:00:07.797114591 +0000 UTC m=+9.111532606,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.250690 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 14:00:44 crc kubenswrapper[4756]: &Event{ObjectMeta:{kube-apiserver-crc.189df446ebd3d06b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 14:00:44 crc kubenswrapper[4756]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 14:00:44 crc kubenswrapper[4756]: Mar 18 14:00:44 crc kubenswrapper[4756]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:13.695258731 +0000 UTC m=+15.009676716,LastTimestamp:2026-03-18 14:00:13.695258731 +0000 UTC m=+15.009676716,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 14:00:44 crc kubenswrapper[4756]: > Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.257452 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df446ebd49197 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:13.695308183 +0000 UTC m=+15.009726168,LastTimestamp:2026-03-18 14:00:13.695308183 +0000 UTC m=+15.009726168,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.260940 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df446ebd3d06b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 14:00:44 crc kubenswrapper[4756]: &Event{ObjectMeta:{kube-apiserver-crc.189df446ebd3d06b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 14:00:44 crc kubenswrapper[4756]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 14:00:44 crc kubenswrapper[4756]: Mar 18 14:00:44 crc kubenswrapper[4756]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:13.695258731 +0000 UTC m=+15.009676716,LastTimestamp:2026-03-18 14:00:13.702562283 +0000 UTC m=+15.016980268,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 14:00:44 crc kubenswrapper[4756]: > Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.262554 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df446ebd49197\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df446ebd49197 openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:13.695308183 +0000 UTC m=+15.009726168,LastTimestamp:2026-03-18 14:00:13.702608154 +0000 UTC m=+15.017026139,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: I0318 14:00:44.262878 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.265209 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df444388dcbc4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df444388dcbc4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:02.097613764 +0000 UTC 
m=+3.412031739,LastTimestamp:2026-03-18 14:00:14.411744611 +0000 UTC m=+15.726162626,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.267956 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df44441dd0264\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df44441dd0264 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:02.253800036 +0000 UTC m=+3.568218021,LastTimestamp:2026-03-18 14:00:14.638225334 +0000 UTC m=+15.952643349,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.272432 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189df444429b5c8e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189df444429b5c8e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:02.266274958 +0000 UTC m=+3.580692943,LastTimestamp:2026-03-18 14:00:14.652210872 +0000 UTC m=+15.966628897,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.279411 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 14:00:44 crc kubenswrapper[4756]: &Event{ObjectMeta:{kube-controller-manager-crc.189df447e052a3fa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 14:00:44 crc kubenswrapper[4756]: body: Mar 18 14:00:44 crc kubenswrapper[4756]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:17.79721113 +0000 UTC m=+19.111629135,LastTimestamp:2026-03-18 14:00:17.79721113 +0000 UTC m=+19.111629135,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 14:00:44 crc 
kubenswrapper[4756]: > Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.284187 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df447e053b26a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:17.797280362 +0000 UTC m=+19.111698377,LastTimestamp:2026-03-18 14:00:17.797280362 +0000 UTC m=+19.111698377,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.289877 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df447e052a3fa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 14:00:44 crc kubenswrapper[4756]: &Event{ObjectMeta:{kube-controller-manager-crc.189df447e052a3fa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 14:00:44 crc kubenswrapper[4756]: body: Mar 18 14:00:44 crc kubenswrapper[4756]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:17.79721113 +0000 UTC m=+19.111629135,LastTimestamp:2026-03-18 14:00:27.796722397 +0000 UTC m=+29.111140402,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 14:00:44 crc kubenswrapper[4756]: > Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.296696 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df447e053b26a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df447e053b26a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:17.797280362 +0000 UTC m=+19.111698377,LastTimestamp:2026-03-18 14:00:27.79677754 
+0000 UTC m=+29.111195555,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.301856 4756 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df44a348222ce openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:27.799544526 +0000 UTC m=+29.113962531,LastTimestamp:2026-03-18 14:00:27.799544526 +0000 UTC m=+29.113962531,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.308016 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df443d57b70de\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df443d57b70de openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.435466462 +0000 UTC m=+1.749884437,LastTimestamp:2026-03-18 14:00:27.9173608 +0000 UTC m=+29.231778785,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.314315 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df443e6ad5bf5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df443e6ad5bf5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.723950581 +0000 UTC m=+2.038368546,LastTimestamp:2026-03-18 14:00:28.120303352 +0000 UTC m=+29.434721367,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.318659 4756 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189df443e73a4762\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df443e73a4762 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:00.73318589 +0000 UTC m=+2.047603905,LastTimestamp:2026-03-18 14:00:28.129358063 +0000 UTC m=+29.443776068,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.325444 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df447e052a3fa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 14:00:44 crc kubenswrapper[4756]: &Event{ObjectMeta:{kube-controller-manager-crc.189df447e052a3fa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 
14:00:44 crc kubenswrapper[4756]: body: Mar 18 14:00:44 crc kubenswrapper[4756]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:17.79721113 +0000 UTC m=+19.111629135,LastTimestamp:2026-03-18 14:00:37.798041242 +0000 UTC m=+39.112459277,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 14:00:44 crc kubenswrapper[4756]: > Mar 18 14:00:44 crc kubenswrapper[4756]: E0318 14:00:44.329910 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df447e053b26a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189df447e053b26a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:17.797280362 +0000 UTC m=+19.111698377,LastTimestamp:2026-03-18 14:00:37.798214276 +0000 UTC m=+39.112632341,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:00:45 crc kubenswrapper[4756]: I0318 14:00:45.268511 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 18 14:00:46 crc kubenswrapper[4756]: I0318 14:00:46.267929 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:47 crc kubenswrapper[4756]: I0318 14:00:47.267069 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:47 crc kubenswrapper[4756]: I0318 14:00:47.796928 4756 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:00:47 crc kubenswrapper[4756]: I0318 14:00:47.797053 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:00:47 crc kubenswrapper[4756]: E0318 14:00:47.804599 4756 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189df447e052a3fa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 14:00:47 crc kubenswrapper[4756]: &Event{ObjectMeta:{kube-controller-manager-crc.189df447e052a3fa openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 14:00:47 crc kubenswrapper[4756]: body: Mar 18 14:00:47 crc kubenswrapper[4756]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:00:17.79721113 +0000 UTC m=+19.111629135,LastTimestamp:2026-03-18 14:00:47.79701355 +0000 UTC m=+49.111431555,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 14:00:47 crc kubenswrapper[4756]: > Mar 18 14:00:48 crc kubenswrapper[4756]: E0318 14:00:48.109328 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 14:00:48 crc kubenswrapper[4756]: I0318 14:00:48.113947 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:48 crc kubenswrapper[4756]: I0318 14:00:48.115416 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:48 crc kubenswrapper[4756]: I0318 14:00:48.115501 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:48 crc kubenswrapper[4756]: I0318 14:00:48.115529 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:48 crc 
kubenswrapper[4756]: I0318 14:00:48.115622 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 14:00:48 crc kubenswrapper[4756]: E0318 14:00:48.121694 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 14:00:48 crc kubenswrapper[4756]: I0318 14:00:48.267564 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:48 crc kubenswrapper[4756]: W0318 14:00:48.946103 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:48 crc kubenswrapper[4756]: E0318 14:00:48.946213 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 14:00:49 crc kubenswrapper[4756]: I0318 14:00:49.267263 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:49 crc kubenswrapper[4756]: E0318 14:00:49.390478 4756 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 14:00:50 crc kubenswrapper[4756]: I0318 14:00:50.267099 4756 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:50 crc kubenswrapper[4756]: W0318 14:00:50.966594 4756 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 18 14:00:50 crc kubenswrapper[4756]: E0318 14:00:50.966662 4756 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 14:00:51 crc kubenswrapper[4756]: I0318 14:00:51.267514 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:52 crc kubenswrapper[4756]: I0318 14:00:52.266941 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:53 crc kubenswrapper[4756]: I0318 14:00:53.266495 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:53 crc kubenswrapper[4756]: I0318 14:00:53.315367 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:53 crc kubenswrapper[4756]: I0318 14:00:53.316645 4756 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:53 crc kubenswrapper[4756]: I0318 14:00:53.316693 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:53 crc kubenswrapper[4756]: I0318 14:00:53.316703 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:53 crc kubenswrapper[4756]: I0318 14:00:53.317203 4756 scope.go:117] "RemoveContainer" containerID="b8806d786a8eb11f43617f498c78b7e1d20a8f0851548d2983f2bb37e7860906" Mar 18 14:00:53 crc kubenswrapper[4756]: E0318 14:00:53.317340 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 14:00:53 crc kubenswrapper[4756]: I0318 14:00:53.939456 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 14:00:53 crc kubenswrapper[4756]: I0318 14:00:53.939607 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:53 crc kubenswrapper[4756]: I0318 14:00:53.940627 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:53 crc kubenswrapper[4756]: I0318 14:00:53.940676 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:53 crc kubenswrapper[4756]: I0318 14:00:53.940687 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:54 crc kubenswrapper[4756]: I0318 
14:00:54.266050 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:55 crc kubenswrapper[4756]: E0318 14:00:55.114180 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 14:00:55 crc kubenswrapper[4756]: I0318 14:00:55.122234 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:55 crc kubenswrapper[4756]: I0318 14:00:55.123490 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:55 crc kubenswrapper[4756]: I0318 14:00:55.123528 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:55 crc kubenswrapper[4756]: I0318 14:00:55.123539 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:55 crc kubenswrapper[4756]: I0318 14:00:55.123563 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 14:00:55 crc kubenswrapper[4756]: E0318 14:00:55.127877 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 14:00:55 crc kubenswrapper[4756]: I0318 14:00:55.266087 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:55 crc 
kubenswrapper[4756]: I0318 14:00:55.951593 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:00:55 crc kubenswrapper[4756]: I0318 14:00:55.951729 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:55 crc kubenswrapper[4756]: I0318 14:00:55.953018 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:55 crc kubenswrapper[4756]: I0318 14:00:55.953083 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:55 crc kubenswrapper[4756]: I0318 14:00:55.953098 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:55 crc kubenswrapper[4756]: I0318 14:00:55.955761 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:00:56 crc kubenswrapper[4756]: I0318 14:00:56.264920 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:56 crc kubenswrapper[4756]: I0318 14:00:56.561352 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:00:56 crc kubenswrapper[4756]: I0318 14:00:56.562250 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:00:56 crc kubenswrapper[4756]: I0318 14:00:56.562287 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:00:56 crc kubenswrapper[4756]: I0318 14:00:56.562300 4756 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:00:57 crc kubenswrapper[4756]: I0318 14:00:57.266272 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:58 crc kubenswrapper[4756]: I0318 14:00:58.264634 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:59 crc kubenswrapper[4756]: I0318 14:00:59.265541 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:00:59 crc kubenswrapper[4756]: E0318 14:00:59.390665 4756 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 14:01:00 crc kubenswrapper[4756]: I0318 14:01:00.265344 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:01:01 crc kubenswrapper[4756]: I0318 14:01:01.265298 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:01:02 crc kubenswrapper[4756]: E0318 14:01:02.121823 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource 
\"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 14:01:02 crc kubenswrapper[4756]: I0318 14:01:02.128825 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:01:02 crc kubenswrapper[4756]: I0318 14:01:02.130588 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:02 crc kubenswrapper[4756]: I0318 14:01:02.130625 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:02 crc kubenswrapper[4756]: I0318 14:01:02.130635 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:02 crc kubenswrapper[4756]: I0318 14:01:02.130660 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 14:01:02 crc kubenswrapper[4756]: E0318 14:01:02.137674 4756 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 14:01:02 crc kubenswrapper[4756]: I0318 14:01:02.268246 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:01:03 crc kubenswrapper[4756]: I0318 14:01:03.262107 4756 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 14:01:03 crc kubenswrapper[4756]: I0318 14:01:03.656885 4756 csr.go:261] certificate signing request csr-kf9lh is approved, waiting to be issued Mar 18 14:01:03 crc kubenswrapper[4756]: I0318 
14:01:03.666099 4756 csr.go:257] certificate signing request csr-kf9lh is issued Mar 18 14:01:03 crc kubenswrapper[4756]: I0318 14:01:03.698692 4756 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 18 14:01:04 crc kubenswrapper[4756]: I0318 14:01:04.129825 4756 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 14:01:04 crc kubenswrapper[4756]: I0318 14:01:04.667987 4756 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-16 19:46:30.592712008 +0000 UTC Mar 18 14:01:04 crc kubenswrapper[4756]: I0318 14:01:04.668058 4756 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7301h45m25.924661168s for next certificate rotation Mar 18 14:01:07 crc kubenswrapper[4756]: I0318 14:01:07.314694 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:01:07 crc kubenswrapper[4756]: I0318 14:01:07.315880 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:07 crc kubenswrapper[4756]: I0318 14:01:07.315907 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:07 crc kubenswrapper[4756]: I0318 14:01:07.315916 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:07 crc kubenswrapper[4756]: I0318 14:01:07.316468 4756 scope.go:117] "RemoveContainer" containerID="b8806d786a8eb11f43617f498c78b7e1d20a8f0851548d2983f2bb37e7860906" Mar 18 14:01:07 crc kubenswrapper[4756]: I0318 14:01:07.591354 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 14:01:07 crc 
kubenswrapper[4756]: I0318 14:01:07.593527 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a"} Mar 18 14:01:07 crc kubenswrapper[4756]: I0318 14:01:07.593669 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:01:07 crc kubenswrapper[4756]: I0318 14:01:07.594834 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:07 crc kubenswrapper[4756]: I0318 14:01:07.594887 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:07 crc kubenswrapper[4756]: I0318 14:01:07.594911 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:08 crc kubenswrapper[4756]: I0318 14:01:08.599175 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 14:01:08 crc kubenswrapper[4756]: I0318 14:01:08.600505 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 14:01:08 crc kubenswrapper[4756]: I0318 14:01:08.603089 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a" exitCode=255 Mar 18 14:01:08 crc kubenswrapper[4756]: I0318 14:01:08.603163 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a"} Mar 18 14:01:08 crc kubenswrapper[4756]: I0318 14:01:08.603253 4756 scope.go:117] "RemoveContainer" containerID="b8806d786a8eb11f43617f498c78b7e1d20a8f0851548d2983f2bb37e7860906" Mar 18 14:01:08 crc kubenswrapper[4756]: I0318 14:01:08.603416 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:01:08 crc kubenswrapper[4756]: I0318 14:01:08.604632 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:08 crc kubenswrapper[4756]: I0318 14:01:08.604680 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:08 crc kubenswrapper[4756]: I0318 14:01:08.604700 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:08 crc kubenswrapper[4756]: I0318 14:01:08.605654 4756 scope.go:117] "RemoveContainer" containerID="c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a" Mar 18 14:01:08 crc kubenswrapper[4756]: E0318 14:01:08.605994 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.138584 4756 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.140023 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 
14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.140060 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.140072 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.140190 4756 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.152735 4756 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.153063 4756 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 18 14:01:09 crc kubenswrapper[4756]: E0318 14:01:09.153097 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.159587 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.159633 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.159648 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.159667 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.159689 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:09Z","lastTransitionTime":"2026-03-18T14:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.175399 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:01:09 crc kubenswrapper[4756]: E0318 14:01:09.182003 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c53
7fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\
\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b
7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi
-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":
\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.190948 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.191194 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.191307 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.191411 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.191500 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:09Z","lastTransitionTime":"2026-03-18T14:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:09 crc kubenswrapper[4756]: E0318 14:01:09.204490 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.211092 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.211297 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.211420 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.211513 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.211591 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:09Z","lastTransitionTime":"2026-03-18T14:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:09 crc kubenswrapper[4756]: E0318 14:01:09.221091 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.228812 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.228856 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.228869 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.228889 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.228903 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:09Z","lastTransitionTime":"2026-03-18T14:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:09 crc kubenswrapper[4756]: E0318 14:01:09.238413 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:09 crc kubenswrapper[4756]: E0318 14:01:09.238560 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 14:01:09 crc kubenswrapper[4756]: E0318 14:01:09.238590 4756 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 14:01:09 crc kubenswrapper[4756]: E0318 14:01:09.339339 4756 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.387060 4756 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.442432 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.442493 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.442506 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.442526 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.442538 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:09Z","lastTransitionTime":"2026-03-18T14:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.545521 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.545569 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.545595 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.545616 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.545632 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:09Z","lastTransitionTime":"2026-03-18T14:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.607578 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.631473 4756 scope.go:117] "RemoveContainer" containerID="c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a" Mar 18 14:01:09 crc kubenswrapper[4756]: E0318 14:01:09.631706 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.649215 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.649268 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.649283 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.649307 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.649321 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:09Z","lastTransitionTime":"2026-03-18T14:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.751347 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.751388 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.751417 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.751435 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.751448 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:09Z","lastTransitionTime":"2026-03-18T14:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.854374 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.854428 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.854445 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.854469 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.854485 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:09Z","lastTransitionTime":"2026-03-18T14:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.957553 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.957600 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.957613 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.957636 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:09 crc kubenswrapper[4756]: I0318 14:01:09.957649 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:09Z","lastTransitionTime":"2026-03-18T14:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.059802 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.059840 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.059851 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.059867 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.059880 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:10Z","lastTransitionTime":"2026-03-18T14:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.162727 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.162771 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.162781 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.162797 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.162811 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:10Z","lastTransitionTime":"2026-03-18T14:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.265329 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.265362 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.265370 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.265384 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.265394 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:10Z","lastTransitionTime":"2026-03-18T14:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.290718 4756 apiserver.go:52] "Watching apiserver" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.293926 4756 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.294474 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.295170 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.295206 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.295253 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.295333 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.295405 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.295640 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.295930 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.296054 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.296109 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.299906 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.300220 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.300273 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.300302 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.300313 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.300844 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.301052 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.301810 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.302088 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.327549 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.340551 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.356765 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.365220 4756 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.367905 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.367940 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.367950 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.367968 4756 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.367978 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:10Z","lastTransitionTime":"2026-03-18T14:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.369267 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.384285 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.394815 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.408900 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.425448 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.440851 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.444575 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.444638 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.444680 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.444717 4756 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.445535 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.445133 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.445361 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.445424 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.445590 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.445721 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.445760 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.445793 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.445827 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.445985 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446034 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446066 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446098 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446162 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446209 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 
14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446241 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446248 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446276 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446308 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446339 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446371 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446404 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446474 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446506 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446539 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446571 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446578 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446602 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.446801 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.447347 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.447395 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.447438 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.447634 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.447795 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.447940 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448026 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448395 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448457 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448493 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448540 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448592 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448599 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448638 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448673 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448721 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448728 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448763 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448809 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448844 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448890 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448933 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448968 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448963 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448988 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449000 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449015 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449201 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449274 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449334 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449384 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449430 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449452 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449482 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449489 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449548 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449597 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449637 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449698 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449751 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449770 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449815 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449862 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449905 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449963 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.449991 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: 
"kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.450021 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.450093 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.450176 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.450263 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.450507 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.450565 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.450603 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.450752 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.450857 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.450938 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.450972 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451183 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451317 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451430 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451544 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451659 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451768 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451874 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451973 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452089 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452227 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452334 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452448 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452560 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452676 4756 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452796 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452925 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.453030 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.453156 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.453283 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.453399 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.453503 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.453601 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.453700 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.453792 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.453886 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.453994 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.454093 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.454233 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.454356 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.454487 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.454588 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.454677 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.454782 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.454888 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.455020 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.455153 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.455275 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.455379 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.455486 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.455593 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.455702 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 18 
14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451066 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451279 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451344 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451295 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451518 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451517 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.457789 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451548 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.457902 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.457917 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451728 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.457965 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.458013 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.459442 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.459497 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.459539 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.459572 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451771 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451801 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451971 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.451999 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462026 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462140 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462193 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462234 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462267 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462298 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462334 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462364 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462392 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462446 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462479 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 14:01:10 crc 
kubenswrapper[4756]: I0318 14:01:10.462503 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462532 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462561 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462588 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462622 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462652 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462711 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462743 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462772 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462802 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462830 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 14:01:10 crc 
kubenswrapper[4756]: I0318 14:01:10.462863 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462890 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462914 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.463024 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.463057 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.463082 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.463109 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452006 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452111 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452202 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452300 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452315 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452440 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452590 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452602 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452621 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452735 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.452962 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.453087 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.453226 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.453427 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.453660 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.453815 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.454173 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.454524 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.454773 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.455069 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.455376 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.455724 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.448249 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.456542 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.463503 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.456671 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.456860 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.457266 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.458019 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.460059 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.460080 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.460181 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.460725 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.460812 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.460867 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.461455 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462163 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462213 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.459783 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462335 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462509 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462740 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.462966 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.463141 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.463771 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.465347 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.465374 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.467971 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.466064 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.465961 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.466359 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.468013 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.466720 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.466721 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.467008 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.467401 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.467436 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.467726 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.467801 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.467868 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.465578 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.468074 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.468337 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.468861 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.468885 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.469010 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.469248 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.469464 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.468701 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.463161 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.469767 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.469828 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " 
Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.469867 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.469912 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.469948 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.469777 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.469812 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.470003 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.470229 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.470637 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.470886 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:01:10.970275413 +0000 UTC m=+72.284693438 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.470990 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.471052 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.471298 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.471338 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.471387 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.471404 4756 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.471430 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.471450 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:10Z","lastTransitionTime":"2026-03-18T14:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.471575 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.471353 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.471760 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.471894 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.471956 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.471997 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472040 4756 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472082 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472148 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472191 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472227 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472263 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472307 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472343 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472386 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472428 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472469 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472506 4756 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472540 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472575 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472609 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472645 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472701 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472764 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472812 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472846 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472885 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472920 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472955 
4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472992 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473026 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473064 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473100 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473164 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473205 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473243 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473284 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473320 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473355 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473389 4756 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473424 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473463 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473499 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473536 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473571 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473607 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473642 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473678 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473715 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473782 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473822 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473858 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473898 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473933 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473974 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.471775 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.474013 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472243 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.472547 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473151 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473543 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.473980 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.474060 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.474112 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.474351 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.474584 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.474664 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.474717 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.474751 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.474850 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.474889 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.474905 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.474920 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475063 4756 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475089 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475105 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475136 4756 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475151 4756 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475168 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475182 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475195 4756 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475209 4756 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475223 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475237 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc 
kubenswrapper[4756]: I0318 14:01:10.475250 4756 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475262 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475275 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475289 4756 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475303 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475316 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475329 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475343 4756 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475355 4756 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475370 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475384 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475398 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475411 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475426 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475441 4756 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475454 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475468 4756 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475483 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475496 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475510 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475526 4756 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475539 4756 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475553 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475566 4756 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475581 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475598 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475612 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475624 4756 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475637 4756 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node 
\"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475650 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475662 4756 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475659 4756 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476062 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475414 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475456 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475978 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476031 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476219 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.475674 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476542 4756 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476568 4756 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.476590 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.476656 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:10.976636638 +0000 UTC m=+72.291054723 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476750 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476591 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476814 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476828 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476842 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476856 4756 reconciler_common.go:293] 
"Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476871 4756 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476885 4756 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476898 4756 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476909 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476921 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476934 4756 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476945 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476957 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476968 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476980 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.476993 4756 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477007 4756 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477022 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477034 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on 
node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477046 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477057 4756 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477068 4756 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477079 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477091 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477102 4756 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477136 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477150 4756 reconciler_common.go:293] "Volume detached for volume 
\"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477163 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477174 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477186 4756 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477201 4756 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477214 4756 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477226 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477238 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath 
\"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477250 4756 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477261 4756 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477276 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477288 4756 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477300 4756 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477313 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477326 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477339 4756 reconciler_common.go:293] "Volume detached for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477353 4756 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477365 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477378 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477389 4756 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477402 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477413 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477428 4756 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node 
\"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477441 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477453 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477465 4756 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477477 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477488 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477501 4756 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477512 4756 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477524 4756 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477537 4756 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477549 4756 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477563 4756 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477574 4756 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477586 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477598 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477611 4756 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477624 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477637 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477650 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477661 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477674 4756 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477686 4756 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477697 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477709 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477720 4756 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477734 4756 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477745 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477756 4756 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477767 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477779 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") 
on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477792 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477806 4756 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477817 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477829 4756 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.477840 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.478291 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.478004 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.479193 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.479282 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.479584 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.479932 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.479959 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.480224 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.480385 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.480375 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.480549 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.480402 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.480506 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.480706 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). 
InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.481195 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.481277 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.481717 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.481747 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.481739 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.482014 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.482184 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:10.981507331 +0000 UTC m=+72.295925346 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.482349 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.482603 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.482928 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.483249 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.485518 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.485993 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.486723 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.488091 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.489743 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.490989 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.491387 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" 
(OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.492776 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.492868 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.492929 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.493040 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:10.993021186 +0000 UTC m=+72.307439161 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.493496 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.493519 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.493532 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.493585 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:10.993570531 +0000 UTC m=+72.307988506 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.494309 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.494317 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.495102 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.495779 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.498572 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.498965 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.500980 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.501468 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.501977 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.502296 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.502749 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.502759 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.503762 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.508535 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.509577 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.509959 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.510039 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.510172 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.510211 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.510215 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.510271 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.510362 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.511539 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.512065 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.512290 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.512313 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.512431 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.512460 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.512558 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.512565 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.513052 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.513421 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.513794 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.514429 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.527544 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.527606 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.539645 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.574928 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.574970 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.574982 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.574999 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.575013 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:10Z","lastTransitionTime":"2026-03-18T14:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578487 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578586 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578626 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578665 4756 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578692 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578715 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578740 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578761 4756 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578780 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578801 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578820 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578839 4756 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578858 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578878 4756 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578897 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578913 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578932 4756 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578950 4756 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578841 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.578967 4756 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579072 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579101 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579165 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579186 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579205 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579224 4756 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579249 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579267 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579285 4756 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579304 4756 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579322 4756 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579341 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579359 4756 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579377 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579395 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579418 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579435 4756 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579455 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579472 4756 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579490 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579509 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579527 4756 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579545 4756 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579562 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579579 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579596 4756 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579613 4756 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579631 4756 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 
14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579663 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579699 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579719 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579742 4756 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579763 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579782 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579800 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 
14:01:10.579818 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579836 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579858 4756 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579875 4756 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579894 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579914 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579933 4756 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 
14:01:10.579950 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579968 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.579986 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.580004 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.580023 4756 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.580039 4756 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.580056 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.580074 4756 reconciler_common.go:293] 
"Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.613044 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.614472 4756 scope.go:117] "RemoveContainer" containerID="c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.614713 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.621522 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.628817 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 14:01:10 crc kubenswrapper[4756]: W0318 14:01:10.636609 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-2e211911995e49e8d7e491556da4a2fdee1bba93391dc2b749aecf2a879772ed WatchSource:0}: Error finding container 2e211911995e49e8d7e491556da4a2fdee1bba93391dc2b749aecf2a879772ed: Status 404 returned error can't find the container with id 2e211911995e49e8d7e491556da4a2fdee1bba93391dc2b749aecf2a879772ed Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.639530 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.640507 4756 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 14:01:10 crc kubenswrapper[4756]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 14:01:10 crc kubenswrapper[4756]: if [[ -f "/env/_master" ]]; then Mar 18 14:01:10 crc kubenswrapper[4756]: set -o allexport Mar 18 14:01:10 crc kubenswrapper[4756]: source "/env/_master" Mar 18 14:01:10 crc kubenswrapper[4756]: set +o allexport Mar 18 14:01:10 crc 
kubenswrapper[4756]: fi Mar 18 14:01:10 crc kubenswrapper[4756]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 18 14:01:10 crc kubenswrapper[4756]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 14:01:10 crc kubenswrapper[4756]: ho_enable="--enable-hybrid-overlay" Mar 18 14:01:10 crc kubenswrapper[4756]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 14:01:10 crc kubenswrapper[4756]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 14:01:10 crc kubenswrapper[4756]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 14:01:10 crc kubenswrapper[4756]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 14:01:10 crc kubenswrapper[4756]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 14:01:10 crc kubenswrapper[4756]: --webhook-host=127.0.0.1 \ Mar 18 14:01:10 crc kubenswrapper[4756]: --webhook-port=9743 \ Mar 18 14:01:10 crc kubenswrapper[4756]: ${ho_enable} \ Mar 18 14:01:10 crc kubenswrapper[4756]: --enable-interconnect \ Mar 18 14:01:10 crc kubenswrapper[4756]: --disable-approver \ Mar 18 14:01:10 crc kubenswrapper[4756]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 14:01:10 crc kubenswrapper[4756]: --wait-for-kubernetes-api=200s \ Mar 18 14:01:10 crc kubenswrapper[4756]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 14:01:10 crc kubenswrapper[4756]: --loglevel="${LOGLEVEL}" Mar 18 14:01:10 crc kubenswrapper[4756]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 14:01:10 crc 
kubenswrapper[4756]: > logger="UnhandledError" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.642891 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.644289 4756 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 14:01:10 crc kubenswrapper[4756]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 14:01:10 crc kubenswrapper[4756]: if [[ -f "/env/_master" ]]; then Mar 18 14:01:10 crc kubenswrapper[4756]: set -o allexport Mar 18 14:01:10 crc kubenswrapper[4756]: source "/env/_master" Mar 18 14:01:10 crc kubenswrapper[4756]: set +o allexport Mar 18 14:01:10 crc kubenswrapper[4756]: fi Mar 18 14:01:10 crc kubenswrapper[4756]: Mar 18 14:01:10 crc kubenswrapper[4756]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 14:01:10 crc kubenswrapper[4756]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 14:01:10 crc kubenswrapper[4756]: --disable-webhook \ Mar 18 14:01:10 crc kubenswrapper[4756]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 14:01:10 crc kubenswrapper[4756]: --loglevel="${LOGLEVEL}" Mar 18 14:01:10 crc kubenswrapper[4756]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 14:01:10 crc kubenswrapper[4756]: > logger="UnhandledError" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.646166 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.651092 4756 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 14:01:10 crc kubenswrapper[4756]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 14:01:10 crc kubenswrapper[4756]: set -o allexport Mar 18 14:01:10 crc kubenswrapper[4756]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 14:01:10 crc kubenswrapper[4756]: source /etc/kubernetes/apiserver-url.env Mar 18 14:01:10 crc kubenswrapper[4756]: else Mar 18 14:01:10 crc kubenswrapper[4756]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 14:01:10 crc kubenswrapper[4756]: exit 1 Mar 18 14:01:10 crc kubenswrapper[4756]: fi Mar 18 14:01:10 crc kubenswrapper[4756]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 14:01:10 crc kubenswrapper[4756]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 14:01:10 crc kubenswrapper[4756]: > logger="UnhandledError" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.652182 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.677981 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 
14:01:10.678336 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.678440 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.678539 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.678626 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:10Z","lastTransitionTime":"2026-03-18T14:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.781827 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.781906 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.781916 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.781930 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.781942 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:10Z","lastTransitionTime":"2026-03-18T14:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.884139 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.884189 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.884201 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.884220 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.884234 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:10Z","lastTransitionTime":"2026-03-18T14:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.982622 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.982768 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.982901 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:01:11.982860978 +0000 UTC m=+73.297279073 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.983010 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.983069 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.983107 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:11.983079905 +0000 UTC m=+73.297497920 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.983256 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:01:10 crc kubenswrapper[4756]: E0318 14:01:10.983365 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:11.983351413 +0000 UTC m=+73.297769398 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.988768 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.988869 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.988888 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:10 crc kubenswrapper[4756]: I0318 14:01:10.988941 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:10 crc 
kubenswrapper[4756]: I0318 14:01:10.988960 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:10Z","lastTransitionTime":"2026-03-18T14:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.083857 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.083926 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.084053 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.084073 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.084087 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.084173 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:12.084156541 +0000 UTC m=+73.398574516 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.084205 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.084257 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.084278 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.084389 4756 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:12.084360136 +0000 UTC m=+73.398778151 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.091447 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.091485 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.091494 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.091509 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.091518 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:11Z","lastTransitionTime":"2026-03-18T14:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.194036 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.194090 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.194100 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.194143 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.194163 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:11Z","lastTransitionTime":"2026-03-18T14:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.254132 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.296885 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.296938 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.296950 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.296969 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.296980 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:11Z","lastTransitionTime":"2026-03-18T14:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.319897 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.321809 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.324798 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.325684 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.326281 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.326813 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.327409 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.327951 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.328564 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.329141 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.329643 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.330348 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.330928 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.331503 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.331991 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.332521 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.333052 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.333431 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.333943 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.334485 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.337828 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.338409 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.338872 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.339870 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.340352 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.341406 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.341998 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.342861 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.343508 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.344357 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.344833 4756 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.344947 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.346822 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.347292 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.347661 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.349647 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.350732 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.351354 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.352631 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.353538 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.354597 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.355364 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.356696 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.357530 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.358728 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.359422 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.360753 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.361699 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.362780 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.363415 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.364569 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.365237 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.365984 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.367061 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.400501 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.400574 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:11 crc 
kubenswrapper[4756]: I0318 14:01:11.400585 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.400606 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.400620 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:11Z","lastTransitionTime":"2026-03-18T14:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.504327 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.504384 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.504393 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.504416 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.504429 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:11Z","lastTransitionTime":"2026-03-18T14:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.607967 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.608022 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.608034 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.608059 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.608073 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:11Z","lastTransitionTime":"2026-03-18T14:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.617881 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2e211911995e49e8d7e491556da4a2fdee1bba93391dc2b749aecf2a879772ed"} Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.620008 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1a7f8ec5c7a07b5e3028bddce9c1cedad3ad257e01d44dc89df0c30f645ab541"} Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.620215 4756 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 14:01:11 crc kubenswrapper[4756]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 14:01:11 crc kubenswrapper[4756]: if [[ -f "/env/_master" ]]; then Mar 18 14:01:11 crc kubenswrapper[4756]: set -o allexport Mar 18 14:01:11 crc kubenswrapper[4756]: source "/env/_master" Mar 18 14:01:11 crc kubenswrapper[4756]: set +o allexport Mar 18 14:01:11 crc kubenswrapper[4756]: fi Mar 18 14:01:11 crc kubenswrapper[4756]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 14:01:11 crc kubenswrapper[4756]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 14:01:11 crc kubenswrapper[4756]: ho_enable="--enable-hybrid-overlay" Mar 18 14:01:11 crc kubenswrapper[4756]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 14:01:11 crc kubenswrapper[4756]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 14:01:11 crc kubenswrapper[4756]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 14:01:11 crc kubenswrapper[4756]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 14:01:11 crc kubenswrapper[4756]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 14:01:11 crc kubenswrapper[4756]: --webhook-host=127.0.0.1 \ Mar 18 14:01:11 crc kubenswrapper[4756]: --webhook-port=9743 \ Mar 18 14:01:11 crc kubenswrapper[4756]: ${ho_enable} \ Mar 18 14:01:11 crc kubenswrapper[4756]: --enable-interconnect \ Mar 18 14:01:11 crc kubenswrapper[4756]: --disable-approver \ Mar 18 14:01:11 crc kubenswrapper[4756]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 14:01:11 crc kubenswrapper[4756]: --wait-for-kubernetes-api=200s \ Mar 18 14:01:11 crc kubenswrapper[4756]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 14:01:11 crc kubenswrapper[4756]: --loglevel="${LOGLEVEL}" Mar 18 14:01:11 crc kubenswrapper[4756]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 14:01:11 crc kubenswrapper[4756]: > logger="UnhandledError" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.623178 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e3ecf40d261384576a33ba78eb04055b54c04126af9f13485fdaf009271e47e8"} Mar 18 14:01:11 crc 
kubenswrapper[4756]: E0318 14:01:11.623430 4756 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 14:01:11 crc kubenswrapper[4756]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 14:01:11 crc kubenswrapper[4756]: if [[ -f "/env/_master" ]]; then Mar 18 14:01:11 crc kubenswrapper[4756]: set -o allexport Mar 18 14:01:11 crc kubenswrapper[4756]: source "/env/_master" Mar 18 14:01:11 crc kubenswrapper[4756]: set +o allexport Mar 18 14:01:11 crc kubenswrapper[4756]: fi Mar 18 14:01:11 crc kubenswrapper[4756]: Mar 18 14:01:11 crc kubenswrapper[4756]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 14:01:11 crc kubenswrapper[4756]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 14:01:11 crc kubenswrapper[4756]: --disable-webhook \ Mar 18 14:01:11 crc kubenswrapper[4756]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 14:01:11 crc kubenswrapper[4756]: --loglevel="${LOGLEVEL}" Mar 18 14:01:11 crc kubenswrapper[4756]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 14:01:11 crc kubenswrapper[4756]: > logger="UnhandledError" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.623939 4756 scope.go:117] "RemoveContainer" containerID="c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a" Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.624109 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.624322 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.624930 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.625311 4756 
kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 14:01:11 crc kubenswrapper[4756]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 14:01:11 crc kubenswrapper[4756]: set -o allexport Mar 18 14:01:11 crc kubenswrapper[4756]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 14:01:11 crc kubenswrapper[4756]: source /etc/kubernetes/apiserver-url.env Mar 18 14:01:11 crc kubenswrapper[4756]: else Mar 18 14:01:11 crc kubenswrapper[4756]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 14:01:11 crc kubenswrapper[4756]: exit 1 Mar 18 14:01:11 crc kubenswrapper[4756]: fi Mar 18 14:01:11 crc kubenswrapper[4756]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 14:01:11 crc kubenswrapper[4756]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI
_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1b
aa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 14:01:11 crc kubenswrapper[4756]: > logger="UnhandledError" Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.625487 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.626565 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 14:01:11 crc 
kubenswrapper[4756]: I0318 14:01:11.632578 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.646315 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.658205 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.667076 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.677491 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.693862 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.704621 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.710517 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.710576 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.710594 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.710618 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.710641 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:11Z","lastTransitionTime":"2026-03-18T14:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.715672 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.725949 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.744723 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.759717 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.773750 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.781780 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.794982 4756 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.813877 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.813938 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.813954 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.813976 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.813991 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:11Z","lastTransitionTime":"2026-03-18T14:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.916664 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.916722 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.916739 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.916765 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.916785 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:11Z","lastTransitionTime":"2026-03-18T14:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.993914 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.994035 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.994099 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:01:13.994060277 +0000 UTC m=+75.308478292 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.994185 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.994266 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:13.994247463 +0000 UTC m=+75.308665478 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:01:11 crc kubenswrapper[4756]: I0318 14:01:11.994259 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.994393 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:01:11 crc kubenswrapper[4756]: E0318 14:01:11.994523 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:13.994499619 +0000 UTC m=+75.308917634 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.019825 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.019986 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.020007 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.020032 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.020049 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:12Z","lastTransitionTime":"2026-03-18T14:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.095356 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.095443 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:12 crc kubenswrapper[4756]: E0318 14:01:12.095624 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:01:12 crc kubenswrapper[4756]: E0318 14:01:12.095634 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:01:12 crc kubenswrapper[4756]: E0318 14:01:12.095694 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:01:12 crc kubenswrapper[4756]: E0318 14:01:12.095715 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 
14:01:12 crc kubenswrapper[4756]: E0318 14:01:12.095651 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:01:12 crc kubenswrapper[4756]: E0318 14:01:12.095826 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:12 crc kubenswrapper[4756]: E0318 14:01:12.095840 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:14.095808191 +0000 UTC m=+75.410226196 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:12 crc kubenswrapper[4756]: E0318 14:01:12.095901 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:14.095875123 +0000 UTC m=+75.410293138 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.122930 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.122984 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.123000 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.123027 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.123046 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:12Z","lastTransitionTime":"2026-03-18T14:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.225949 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.226038 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.226064 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.226095 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.226258 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:12Z","lastTransitionTime":"2026-03-18T14:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.314421 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.314491 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:12 crc kubenswrapper[4756]: E0318 14:01:12.314583 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.314434 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:12 crc kubenswrapper[4756]: E0318 14:01:12.314832 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:12 crc kubenswrapper[4756]: E0318 14:01:12.314901 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.330023 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.330066 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.330080 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.330098 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.330112 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:12Z","lastTransitionTime":"2026-03-18T14:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.432374 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.432455 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.432468 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.432487 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.432500 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:12Z","lastTransitionTime":"2026-03-18T14:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.535198 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.535240 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.535251 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.535265 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.535274 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:12Z","lastTransitionTime":"2026-03-18T14:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.638205 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.638266 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.638283 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.638309 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.638327 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:12Z","lastTransitionTime":"2026-03-18T14:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.741016 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.741050 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.741059 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.741071 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.741080 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:12Z","lastTransitionTime":"2026-03-18T14:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.844048 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.844097 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.844160 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.844210 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.844231 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:12Z","lastTransitionTime":"2026-03-18T14:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.947311 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.947361 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.947372 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.947391 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:12 crc kubenswrapper[4756]: I0318 14:01:12.947403 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:12Z","lastTransitionTime":"2026-03-18T14:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.050716 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.050808 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.050826 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.050853 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.050869 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:13Z","lastTransitionTime":"2026-03-18T14:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.154303 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.154352 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.154371 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.154394 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.154410 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:13Z","lastTransitionTime":"2026-03-18T14:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.257746 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.257829 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.257857 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.257891 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.257918 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:13Z","lastTransitionTime":"2026-03-18T14:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.361601 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.361666 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.361691 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.361723 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.361749 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:13Z","lastTransitionTime":"2026-03-18T14:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.465110 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.465216 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.465238 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.465266 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.465306 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:13Z","lastTransitionTime":"2026-03-18T14:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.567843 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.567910 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.567933 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.567960 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.567981 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:13Z","lastTransitionTime":"2026-03-18T14:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.670868 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.670914 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.670926 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.670942 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.670953 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:13Z","lastTransitionTime":"2026-03-18T14:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.773628 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.773691 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.773714 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.773742 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.773762 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:13Z","lastTransitionTime":"2026-03-18T14:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.876234 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.876300 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.876322 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.876346 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.876365 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:13Z","lastTransitionTime":"2026-03-18T14:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.979164 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.979238 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.979266 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.979296 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:13 crc kubenswrapper[4756]: I0318 14:01:13.979315 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:13Z","lastTransitionTime":"2026-03-18T14:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.015061 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.015238 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.015284 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.015361 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.015377 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:01:18.015349262 +0000 UTC m=+79.329767237 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.015427 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:18.015411484 +0000 UTC m=+79.329829469 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.015446 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.015515 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:18.015500387 +0000 UTC m=+79.329918372 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.081524 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.081574 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.081587 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.081604 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.081618 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:14Z","lastTransitionTime":"2026-03-18T14:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.116441 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.116507 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.116631 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.116653 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.116667 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.116663 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 
14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.116737 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.116756 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.116783 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:18.116768667 +0000 UTC m=+79.431186652 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.116817 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:18.116797848 +0000 UTC m=+79.431215863 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.185725 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.185820 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.185850 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.185886 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.185912 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:14Z","lastTransitionTime":"2026-03-18T14:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.289415 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.289529 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.289570 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.289602 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.289624 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:14Z","lastTransitionTime":"2026-03-18T14:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.315166 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.315235 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.315285 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.315302 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.315419 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:14 crc kubenswrapper[4756]: E0318 14:01:14.315591 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.392762 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.392838 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.392857 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.392882 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.392899 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:14Z","lastTransitionTime":"2026-03-18T14:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.495618 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.495696 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.495719 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.495750 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.495774 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:14Z","lastTransitionTime":"2026-03-18T14:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.598719 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.598835 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.598912 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.598946 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.598974 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:14Z","lastTransitionTime":"2026-03-18T14:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.701964 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.702114 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.702347 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.702436 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.702477 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:14Z","lastTransitionTime":"2026-03-18T14:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.805499 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.805567 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.805584 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.805609 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.805626 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:14Z","lastTransitionTime":"2026-03-18T14:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.908301 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.908368 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.908391 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.908420 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:14 crc kubenswrapper[4756]: I0318 14:01:14.908444 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:14Z","lastTransitionTime":"2026-03-18T14:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.010453 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.010493 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.010503 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.010521 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.010532 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:15Z","lastTransitionTime":"2026-03-18T14:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.113581 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.113637 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.113653 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.113675 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.113690 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:15Z","lastTransitionTime":"2026-03-18T14:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.216739 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.216782 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.216792 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.216810 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.216821 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:15Z","lastTransitionTime":"2026-03-18T14:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.319581 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.319633 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.319652 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.319678 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.319701 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:15Z","lastTransitionTime":"2026-03-18T14:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.421789 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.421833 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.421844 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.421861 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.421874 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:15Z","lastTransitionTime":"2026-03-18T14:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.523720 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.523765 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.523777 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.523794 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.523808 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:15Z","lastTransitionTime":"2026-03-18T14:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.625676 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.625723 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.625734 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.625751 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.625762 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:15Z","lastTransitionTime":"2026-03-18T14:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.728761 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.728816 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.728826 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.728843 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.728854 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:15Z","lastTransitionTime":"2026-03-18T14:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.831114 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.831190 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.831202 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.831217 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.831228 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:15Z","lastTransitionTime":"2026-03-18T14:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.934163 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.934223 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.934230 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.934243 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:15 crc kubenswrapper[4756]: I0318 14:01:15.934252 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:15Z","lastTransitionTime":"2026-03-18T14:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.036086 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.036149 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.036161 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.036177 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.036189 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:16Z","lastTransitionTime":"2026-03-18T14:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.138364 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.138422 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.138438 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.138457 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.138469 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:16Z","lastTransitionTime":"2026-03-18T14:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.240748 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.240813 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.240838 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.240860 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.240875 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:16Z","lastTransitionTime":"2026-03-18T14:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.314472 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.314557 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.314514 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:16 crc kubenswrapper[4756]: E0318 14:01:16.314681 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:16 crc kubenswrapper[4756]: E0318 14:01:16.314884 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:16 crc kubenswrapper[4756]: E0318 14:01:16.315013 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.323399 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.343041 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.343087 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.343099 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.343136 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.343149 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:16Z","lastTransitionTime":"2026-03-18T14:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.444853 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.444887 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.444898 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.444914 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.444929 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:16Z","lastTransitionTime":"2026-03-18T14:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.547182 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.547254 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.547271 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.547287 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.547321 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:16Z","lastTransitionTime":"2026-03-18T14:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.648997 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.649051 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.649065 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.649080 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.649090 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:16Z","lastTransitionTime":"2026-03-18T14:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.750865 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.750918 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.750935 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.750955 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.750968 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:16Z","lastTransitionTime":"2026-03-18T14:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.852813 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.852851 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.852861 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.852877 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.852888 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:16Z","lastTransitionTime":"2026-03-18T14:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.955747 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.955832 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.955851 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.955876 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:16 crc kubenswrapper[4756]: I0318 14:01:16.955893 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:16Z","lastTransitionTime":"2026-03-18T14:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.058804 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.058847 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.058856 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.058872 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.058881 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:17Z","lastTransitionTime":"2026-03-18T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.164301 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.164382 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.164419 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.164451 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.164477 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:17Z","lastTransitionTime":"2026-03-18T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.267560 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.267613 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.267625 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.267643 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.267655 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:17Z","lastTransitionTime":"2026-03-18T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.369889 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.369937 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.369949 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.369966 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.369977 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:17Z","lastTransitionTime":"2026-03-18T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.472618 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.472663 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.472676 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.472693 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.472706 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:17Z","lastTransitionTime":"2026-03-18T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.575087 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.575161 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.575178 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.575194 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.575206 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:17Z","lastTransitionTime":"2026-03-18T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.678055 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.678104 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.678139 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.678159 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.678173 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:17Z","lastTransitionTime":"2026-03-18T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.780111 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.780176 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.780196 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.780213 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.780224 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:17Z","lastTransitionTime":"2026-03-18T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.882523 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.882552 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.882562 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.882576 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.882587 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:17Z","lastTransitionTime":"2026-03-18T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.985586 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.985701 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.985909 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.985939 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:17 crc kubenswrapper[4756]: I0318 14:01:17.985950 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:17Z","lastTransitionTime":"2026-03-18T14:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.053496 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.053597 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.053636 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.053709 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.053729 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.053710 4756 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:01:26.053683048 +0000 UTC m=+87.368101023 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.053858 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:26.053842002 +0000 UTC m=+87.368260047 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.053880 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:26.053870043 +0000 UTC m=+87.368288118 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.088531 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.088587 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.088600 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.088620 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.088631 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:18Z","lastTransitionTime":"2026-03-18T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.154290 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.154366 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.154437 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.154452 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.154456 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.154467 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.154471 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl 
for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.154476 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.154526 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:26.154511176 +0000 UTC m=+87.468929151 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.154543 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:26.154535297 +0000 UTC m=+87.468953272 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.190506 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.190544 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.190552 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.190592 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.190603 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:18Z","lastTransitionTime":"2026-03-18T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.292587 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.292624 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.292634 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.292649 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.292659 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:18Z","lastTransitionTime":"2026-03-18T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.315110 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.315175 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.315308 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.315360 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.315177 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:18 crc kubenswrapper[4756]: E0318 14:01:18.315430 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.395994 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.396063 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.396085 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.396108 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.396159 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:18Z","lastTransitionTime":"2026-03-18T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.499346 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.499383 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.499393 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.499409 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.499419 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:18Z","lastTransitionTime":"2026-03-18T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.602849 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.602892 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.602903 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.602922 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.602936 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:18Z","lastTransitionTime":"2026-03-18T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.705243 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.705278 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.705288 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.705300 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.705308 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:18Z","lastTransitionTime":"2026-03-18T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.807621 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.808212 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.808245 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.808264 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.808275 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:18Z","lastTransitionTime":"2026-03-18T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.910653 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.910788 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.910809 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.910828 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:18 crc kubenswrapper[4756]: I0318 14:01:18.910842 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:18Z","lastTransitionTime":"2026-03-18T14:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.013561 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.013611 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.013619 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.013639 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.013648 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:19Z","lastTransitionTime":"2026-03-18T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.116338 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.116372 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.116380 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.116392 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.116404 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:19Z","lastTransitionTime":"2026-03-18T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.218295 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.218339 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.218380 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.218397 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.218407 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:19Z","lastTransitionTime":"2026-03-18T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.319849 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.319890 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.319903 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.319917 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.319927 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:19Z","lastTransitionTime":"2026-03-18T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.325473 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.333721 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.345400 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.353744 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.363614 4756 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.371154 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.381006 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.391286 4756 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.422654 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.422710 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.422722 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.422738 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.422751 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:19Z","lastTransitionTime":"2026-03-18T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.524665 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.524730 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.524747 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.524770 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.524788 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:19Z","lastTransitionTime":"2026-03-18T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:19 crc kubenswrapper[4756]: E0318 14:01:19.539572 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.544641 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.544732 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.544751 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.544774 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.544791 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:19Z","lastTransitionTime":"2026-03-18T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:19 crc kubenswrapper[4756]: E0318 14:01:19.562274 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.566314 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.566365 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.566378 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.566396 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.566410 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:19Z","lastTransitionTime":"2026-03-18T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.596207 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.596260 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.596273 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.596290 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.596304 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:19Z","lastTransitionTime":"2026-03-18T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:19 crc kubenswrapper[4756]: E0318 14:01:19.605338 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:19 crc kubenswrapper[4756]: E0318 14:01:19.605527 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.607159 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.607197 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.607232 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.607251 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.607264 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:19Z","lastTransitionTime":"2026-03-18T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.709826 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.709873 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.709884 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.709897 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.709906 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:19Z","lastTransitionTime":"2026-03-18T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.812296 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.812374 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.812416 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.812449 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.812472 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:19Z","lastTransitionTime":"2026-03-18T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.915360 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.915424 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.915448 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.915477 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:19 crc kubenswrapper[4756]: I0318 14:01:19.915496 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:19Z","lastTransitionTime":"2026-03-18T14:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.019069 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.019170 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.019212 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.019245 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.019272 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:20Z","lastTransitionTime":"2026-03-18T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.122278 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.122325 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.122336 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.122353 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.122368 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:20Z","lastTransitionTime":"2026-03-18T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.225179 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.225266 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.225285 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.225318 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.225337 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:20Z","lastTransitionTime":"2026-03-18T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.315373 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.315432 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.315497 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:20 crc kubenswrapper[4756]: E0318 14:01:20.315562 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:20 crc kubenswrapper[4756]: E0318 14:01:20.315625 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:20 crc kubenswrapper[4756]: E0318 14:01:20.315699 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.328086 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.328147 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.328159 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.328174 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.328186 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:20Z","lastTransitionTime":"2026-03-18T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.431344 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.431411 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.431429 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.431456 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.431474 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:20Z","lastTransitionTime":"2026-03-18T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.533951 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.534005 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.534018 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.534036 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.534050 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:20Z","lastTransitionTime":"2026-03-18T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.637221 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.637282 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.637300 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.637328 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.637345 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:20Z","lastTransitionTime":"2026-03-18T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.740449 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.740492 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.740506 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.740521 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.740531 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:20Z","lastTransitionTime":"2026-03-18T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.843163 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.843229 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.843245 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.843270 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.843287 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:20Z","lastTransitionTime":"2026-03-18T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.947108 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.947213 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.947231 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.947256 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:20 crc kubenswrapper[4756]: I0318 14:01:20.947276 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:20Z","lastTransitionTime":"2026-03-18T14:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.050282 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.050337 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.050353 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.050375 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.050391 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:21Z","lastTransitionTime":"2026-03-18T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.153447 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.153531 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.153550 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.153578 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.153593 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:21Z","lastTransitionTime":"2026-03-18T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.256584 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.256653 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.256670 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.256703 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.256722 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:21Z","lastTransitionTime":"2026-03-18T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.359089 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.359196 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.359215 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.359239 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.359258 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:21Z","lastTransitionTime":"2026-03-18T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.462552 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.462634 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.462659 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.462692 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.462716 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:21Z","lastTransitionTime":"2026-03-18T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.566674 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.566736 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.566755 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.566781 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.566799 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:21Z","lastTransitionTime":"2026-03-18T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.669565 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.669667 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.669681 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.669698 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.669712 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:21Z","lastTransitionTime":"2026-03-18T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.714228 4756 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.772575 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.772652 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.772676 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.772708 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.772730 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:21Z","lastTransitionTime":"2026-03-18T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.876015 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.876081 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.876155 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.876191 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.876217 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:21Z","lastTransitionTime":"2026-03-18T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.979794 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.979865 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.979882 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.979908 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:21 crc kubenswrapper[4756]: I0318 14:01:21.979933 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:21Z","lastTransitionTime":"2026-03-18T14:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.083423 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.083518 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.083540 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.083564 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.083581 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:22Z","lastTransitionTime":"2026-03-18T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.191833 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.191904 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.191922 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.191949 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.191966 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:22Z","lastTransitionTime":"2026-03-18T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.295057 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.295179 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.295206 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.295235 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.295255 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:22Z","lastTransitionTime":"2026-03-18T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.314386 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.314436 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.314459 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:22 crc kubenswrapper[4756]: E0318 14:01:22.314570 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:22 crc kubenswrapper[4756]: E0318 14:01:22.314767 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:22 crc kubenswrapper[4756]: E0318 14:01:22.314917 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.398694 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.398773 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.398793 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.398818 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.398840 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:22Z","lastTransitionTime":"2026-03-18T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.502644 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.502721 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.502746 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.502777 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.502797 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:22Z","lastTransitionTime":"2026-03-18T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.605751 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.605817 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.605834 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.605858 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.605879 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:22Z","lastTransitionTime":"2026-03-18T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.709112 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.709234 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.709266 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.709298 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.709324 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:22Z","lastTransitionTime":"2026-03-18T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.812042 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.812091 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.812112 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.812169 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.812186 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:22Z","lastTransitionTime":"2026-03-18T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.915793 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.915858 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.915870 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.915893 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:22 crc kubenswrapper[4756]: I0318 14:01:22.915907 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:22Z","lastTransitionTime":"2026-03-18T14:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.018757 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.018820 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.018832 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.018854 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.018894 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:23Z","lastTransitionTime":"2026-03-18T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.122377 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.122454 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.122474 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.122508 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.122533 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:23Z","lastTransitionTime":"2026-03-18T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.225602 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.225684 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.225708 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.225782 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.225807 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:23Z","lastTransitionTime":"2026-03-18T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.315172 4756 scope.go:117] "RemoveContainer" containerID="c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a" Mar 18 14:01:23 crc kubenswrapper[4756]: E0318 14:01:23.315369 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.328358 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.328395 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.328407 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.328420 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.328432 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:23Z","lastTransitionTime":"2026-03-18T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.430543 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.430580 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.430589 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.430603 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.430614 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:23Z","lastTransitionTime":"2026-03-18T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.533400 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.533506 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.533526 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.533552 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.533571 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:23Z","lastTransitionTime":"2026-03-18T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.636771 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.636846 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.636870 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.636897 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.636916 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:23Z","lastTransitionTime":"2026-03-18T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.703652 4756 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.740218 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.740262 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.740273 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.740288 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.740297 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:23Z","lastTransitionTime":"2026-03-18T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.842833 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.842913 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.842950 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.842982 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.843003 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:23Z","lastTransitionTime":"2026-03-18T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.945550 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.945598 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.945610 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.945626 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:23 crc kubenswrapper[4756]: I0318 14:01:23.945641 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:23Z","lastTransitionTime":"2026-03-18T14:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.049514 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.049581 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.049598 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.049626 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.049652 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:24Z","lastTransitionTime":"2026-03-18T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.153041 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.153165 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.153187 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.153221 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.153243 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:24Z","lastTransitionTime":"2026-03-18T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.256815 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.257113 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.257178 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.257215 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.257234 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:24Z","lastTransitionTime":"2026-03-18T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.314946 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.315022 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.314962 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:24 crc kubenswrapper[4756]: E0318 14:01:24.315221 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:24 crc kubenswrapper[4756]: E0318 14:01:24.315331 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:24 crc kubenswrapper[4756]: E0318 14:01:24.315685 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.360224 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.360281 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.360298 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.360321 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.360342 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:24Z","lastTransitionTime":"2026-03-18T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.463332 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.463428 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.463465 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.463497 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.463524 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:24Z","lastTransitionTime":"2026-03-18T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.566810 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.566877 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.566898 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.566927 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.566950 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:24Z","lastTransitionTime":"2026-03-18T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.662999 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789"} Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.663094 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1"} Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.669811 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.669853 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.669865 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.669883 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.669898 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:24Z","lastTransitionTime":"2026-03-18T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.676788 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.690080 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.699443 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.712436 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bba
c22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.724048 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.736056 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.751341 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.762838 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.771744 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.771790 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.771801 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.771820 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.771831 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:24Z","lastTransitionTime":"2026-03-18T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.874494 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.874526 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.874535 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.874547 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.874556 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:24Z","lastTransitionTime":"2026-03-18T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.977375 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.978213 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.978370 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.978500 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:24 crc kubenswrapper[4756]: I0318 14:01:24.978611 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:24Z","lastTransitionTime":"2026-03-18T14:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.020698 4756 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.082089 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.082372 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.082708 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.083044 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.083428 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:25Z","lastTransitionTime":"2026-03-18T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.186178 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.186249 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.186271 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.186303 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.186324 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:25Z","lastTransitionTime":"2026-03-18T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.290020 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.290095 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.290150 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.290178 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.290195 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:25Z","lastTransitionTime":"2026-03-18T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.392452 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.392513 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.392533 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.392558 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.392576 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:25Z","lastTransitionTime":"2026-03-18T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.494972 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.495038 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.495056 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.495081 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.495098 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:25Z","lastTransitionTime":"2026-03-18T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.597597 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.597653 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.597670 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.597694 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.597710 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:25Z","lastTransitionTime":"2026-03-18T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.700656 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.700719 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.700737 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.700759 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.700778 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:25Z","lastTransitionTime":"2026-03-18T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.803468 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.803515 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.803535 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.803555 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.803569 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:25Z","lastTransitionTime":"2026-03-18T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.906344 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.906407 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.906419 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.906437 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:25 crc kubenswrapper[4756]: I0318 14:01:25.906449 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:25Z","lastTransitionTime":"2026-03-18T14:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.009365 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.009397 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.009406 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.009420 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.009430 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:26Z","lastTransitionTime":"2026-03-18T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.112979 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.113053 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.113091 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.113143 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.113160 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:26Z","lastTransitionTime":"2026-03-18T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.129968 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.130099 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.130210 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.130316 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:01:42.130287667 +0000 UTC m=+103.444705682 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.130415 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.130443 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.130503 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:42.130481382 +0000 UTC m=+103.444899397 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.130565 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-18 14:01:42.130530783 +0000 UTC m=+103.444948798 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.217055 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.217142 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.217180 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.217216 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.217243 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:26Z","lastTransitionTime":"2026-03-18T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.231746 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.231832 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.232028 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.232065 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.232074 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.232100 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.232108 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.232173 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.232231 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:42.232202615 +0000 UTC m=+103.546620630 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.232272 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:42.232250516 +0000 UTC m=+103.546668531 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.315403 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.315464 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.315492 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.315813 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.316110 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:26 crc kubenswrapper[4756]: E0318 14:01:26.316468 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.320282 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.320336 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.320356 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.320384 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.320459 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:26Z","lastTransitionTime":"2026-03-18T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.423488 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.423557 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.423581 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.423615 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.423646 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:26Z","lastTransitionTime":"2026-03-18T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.526155 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.526196 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.526206 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.526230 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.526242 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:26Z","lastTransitionTime":"2026-03-18T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.628615 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.628976 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.628988 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.629008 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.629020 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:26Z","lastTransitionTime":"2026-03-18T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.670950 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9"} Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.687323 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:26Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.708091 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 
14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:26Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.728301 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:26Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.732533 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.732606 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.732632 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.732658 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.732675 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:26Z","lastTransitionTime":"2026-03-18T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.747920 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:26Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.769115 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:26Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.789751 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:26Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.812371 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:26Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.833047 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:26Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.835554 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.835604 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.835614 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.835633 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.835652 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:26Z","lastTransitionTime":"2026-03-18T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.938696 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.938749 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.938768 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.938791 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:26 crc kubenswrapper[4756]: I0318 14:01:26.938811 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:26Z","lastTransitionTime":"2026-03-18T14:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.042411 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.042475 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.042495 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.042522 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.042539 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:27Z","lastTransitionTime":"2026-03-18T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.146166 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.146301 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.146383 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.146418 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.146511 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:27Z","lastTransitionTime":"2026-03-18T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.249641 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.249705 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.249729 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.249757 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.249778 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:27Z","lastTransitionTime":"2026-03-18T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.352229 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.352301 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.352318 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.352340 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.352356 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:27Z","lastTransitionTime":"2026-03-18T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.455039 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.455088 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.455105 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.455186 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.455209 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:27Z","lastTransitionTime":"2026-03-18T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.558537 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.558793 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.558920 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.559016 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.559158 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:27Z","lastTransitionTime":"2026-03-18T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.661828 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.661884 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.661902 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.661927 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.661946 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:27Z","lastTransitionTime":"2026-03-18T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.764825 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.764859 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.764868 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.764882 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.764891 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:27Z","lastTransitionTime":"2026-03-18T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.871676 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.871704 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.871711 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.871724 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.871733 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:27Z","lastTransitionTime":"2026-03-18T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.973664 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.973700 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.973713 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.973728 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:27 crc kubenswrapper[4756]: I0318 14:01:27.973743 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:27Z","lastTransitionTime":"2026-03-18T14:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.076795 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.076839 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.076853 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.076872 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.076884 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:28Z","lastTransitionTime":"2026-03-18T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.179324 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.179378 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.179402 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.179420 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.179433 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:28Z","lastTransitionTime":"2026-03-18T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.281594 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.281638 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.281668 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.281683 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.281692 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:28Z","lastTransitionTime":"2026-03-18T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.315083 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.315083 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:28 crc kubenswrapper[4756]: E0318 14:01:28.315234 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:28 crc kubenswrapper[4756]: E0318 14:01:28.315299 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.315102 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:28 crc kubenswrapper[4756]: E0318 14:01:28.315372 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.384337 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.384373 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.384384 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.384399 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.384410 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:28Z","lastTransitionTime":"2026-03-18T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.486611 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.486669 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.486681 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.486699 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.486712 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:28Z","lastTransitionTime":"2026-03-18T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.588724 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.588791 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.588802 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.588817 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.588825 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:28Z","lastTransitionTime":"2026-03-18T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.691718 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.691766 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.691804 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.691823 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.691834 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:28Z","lastTransitionTime":"2026-03-18T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.793945 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.794014 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.794025 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.794065 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.794079 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:28Z","lastTransitionTime":"2026-03-18T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.897111 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.897664 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.897860 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.898042 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:28 crc kubenswrapper[4756]: I0318 14:01:28.898262 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:28Z","lastTransitionTime":"2026-03-18T14:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.001717 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.001777 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.001792 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.001816 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.001833 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:29Z","lastTransitionTime":"2026-03-18T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.103724 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.103761 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.103772 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.103789 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.103803 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:29Z","lastTransitionTime":"2026-03-18T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.206857 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.207222 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.207442 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.207652 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.207850 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:29Z","lastTransitionTime":"2026-03-18T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.310844 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.310905 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.310924 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.310949 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.310967 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:29Z","lastTransitionTime":"2026-03-18T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.335315 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.344436 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.361387 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.376564 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.389734 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.400773 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.411527 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.412795 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.412846 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.412860 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.412881 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.412894 4756 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:29Z","lastTransitionTime":"2026-03-18T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.426516 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.437954 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.515200 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.515230 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.515238 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 
14:01:29.515250 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.515259 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:29Z","lastTransitionTime":"2026-03-18T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.617338 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.617416 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.617435 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.617462 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.617479 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:29Z","lastTransitionTime":"2026-03-18T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.685361 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23"} Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.696641 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.696689 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.696705 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.696723 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.696738 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:29Z","lastTransitionTime":"2026-03-18T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.701425 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: E0318 14:01:29.710835 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.714602 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.714640 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.714650 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.714668 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.714680 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:29Z","lastTransitionTime":"2026-03-18T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.717726 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: E0318 14:01:29.728707 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.732045 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.732072 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.732080 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.732092 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.732100 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:29Z","lastTransitionTime":"2026-03-18T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.733849 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: E0318 14:01:29.745787 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.747650 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.748995 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.749020 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.749028 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.749042 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.749051 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:29Z","lastTransitionTime":"2026-03-18T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.759822 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: E0318 14:01:29.762229 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.766004 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.766038 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.766052 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.766068 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.766080 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:29Z","lastTransitionTime":"2026-03-18T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:29 crc kubenswrapper[4756]: E0318 14:01:29.778202 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: E0318 14:01:29.778520 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.780021 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.780049 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.780060 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.780075 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.780086 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:29Z","lastTransitionTime":"2026-03-18T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.784765 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.798234 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.813539 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.828608 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.882832 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.882883 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.882904 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.882931 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.882952 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:29Z","lastTransitionTime":"2026-03-18T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.985512 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.985572 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.985593 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.985615 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:29 crc kubenswrapper[4756]: I0318 14:01:29.985632 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:29Z","lastTransitionTime":"2026-03-18T14:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.089332 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.089415 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.089445 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.089474 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.089496 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:30Z","lastTransitionTime":"2026-03-18T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.192489 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.193202 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.193238 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.193258 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.193273 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:30Z","lastTransitionTime":"2026-03-18T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.296094 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.296158 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.296169 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.296196 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.296208 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:30Z","lastTransitionTime":"2026-03-18T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.314991 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.315073 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.315166 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:30 crc kubenswrapper[4756]: E0318 14:01:30.315175 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:30 crc kubenswrapper[4756]: E0318 14:01:30.315349 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:30 crc kubenswrapper[4756]: E0318 14:01:30.315384 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.398351 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.398389 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.398401 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.398415 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.398428 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:30Z","lastTransitionTime":"2026-03-18T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.501332 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.501369 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.501380 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.501397 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.501409 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:30Z","lastTransitionTime":"2026-03-18T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.603931 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.604014 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.604039 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.604069 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.604091 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:30Z","lastTransitionTime":"2026-03-18T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.705751 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.705804 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.705813 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.705827 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.705838 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:30Z","lastTransitionTime":"2026-03-18T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.808487 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.808550 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.808561 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.808578 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.808591 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:30Z","lastTransitionTime":"2026-03-18T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.911538 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.911581 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.911591 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.911606 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:30 crc kubenswrapper[4756]: I0318 14:01:30.911615 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:30Z","lastTransitionTime":"2026-03-18T14:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.013953 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.014007 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.014025 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.014050 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.014067 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:31Z","lastTransitionTime":"2026-03-18T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.117529 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.117567 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.117577 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.117593 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.117602 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:31Z","lastTransitionTime":"2026-03-18T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.219736 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.219769 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.219779 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.219793 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.219803 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:31Z","lastTransitionTime":"2026-03-18T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.323597 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.323703 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.323731 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.323765 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.323800 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:31Z","lastTransitionTime":"2026-03-18T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.426896 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.426936 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.426946 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.426963 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.426976 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:31Z","lastTransitionTime":"2026-03-18T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.529973 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.530308 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.530492 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.530673 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.530823 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:31Z","lastTransitionTime":"2026-03-18T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.633628 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.633668 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.633680 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.633695 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.633708 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:31Z","lastTransitionTime":"2026-03-18T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.737633 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.737687 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.737704 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.737731 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.737748 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:31Z","lastTransitionTime":"2026-03-18T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.839881 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.839945 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.839989 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.840016 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.840034 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:31Z","lastTransitionTime":"2026-03-18T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.942208 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.942244 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.942255 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.942273 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:31 crc kubenswrapper[4756]: I0318 14:01:31.942284 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:31Z","lastTransitionTime":"2026-03-18T14:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.044345 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.044408 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.044425 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.044449 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.044468 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:32Z","lastTransitionTime":"2026-03-18T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.147625 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.147698 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.147716 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.147746 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.147765 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:32Z","lastTransitionTime":"2026-03-18T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.250834 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.250903 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.250920 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.250947 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.250966 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:32Z","lastTransitionTime":"2026-03-18T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.315375 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.315414 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.315548 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:32 crc kubenswrapper[4756]: E0318 14:01:32.315778 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:32 crc kubenswrapper[4756]: E0318 14:01:32.315944 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:32 crc kubenswrapper[4756]: E0318 14:01:32.316215 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.354057 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.354155 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.354173 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.354203 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.354221 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:32Z","lastTransitionTime":"2026-03-18T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.456984 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.457031 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.457044 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.457061 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.457073 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:32Z","lastTransitionTime":"2026-03-18T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.559510 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.559582 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.559602 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.559624 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.559641 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:32Z","lastTransitionTime":"2026-03-18T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.662255 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.662312 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.662330 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.662352 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.662370 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:32Z","lastTransitionTime":"2026-03-18T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.763881 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.763919 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.763930 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.763946 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.763957 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:32Z","lastTransitionTime":"2026-03-18T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.866769 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.866846 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.866863 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.866887 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.866906 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:32Z","lastTransitionTime":"2026-03-18T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.969967 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.970023 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.970035 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.970056 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:32 crc kubenswrapper[4756]: I0318 14:01:32.970068 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:32Z","lastTransitionTime":"2026-03-18T14:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.072580 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.072643 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.072689 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.072713 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.072730 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:33Z","lastTransitionTime":"2026-03-18T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.174951 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.174993 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.175005 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.175021 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.175033 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:33Z","lastTransitionTime":"2026-03-18T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.277890 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.277925 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.277935 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.277949 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.277961 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:33Z","lastTransitionTime":"2026-03-18T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.380308 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.380425 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.380444 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.380470 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.380487 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:33Z","lastTransitionTime":"2026-03-18T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.483595 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.483674 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.483686 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.483706 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.483717 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:33Z","lastTransitionTime":"2026-03-18T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.585849 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.585884 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.585894 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.585909 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.585919 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:33Z","lastTransitionTime":"2026-03-18T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.688666 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.688730 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.688742 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.688757 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.688768 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:33Z","lastTransitionTime":"2026-03-18T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.791860 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.791927 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.791945 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.791968 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.791984 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:33Z","lastTransitionTime":"2026-03-18T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.894575 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.894638 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.894656 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.894688 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.894704 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:33Z","lastTransitionTime":"2026-03-18T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.997157 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.997232 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.997263 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.997298 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:33 crc kubenswrapper[4756]: I0318 14:01:33.997322 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:33Z","lastTransitionTime":"2026-03-18T14:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.099927 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.099976 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.099985 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.100002 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.100015 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:34Z","lastTransitionTime":"2026-03-18T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.203377 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.203438 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.203456 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.203482 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.203500 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:34Z","lastTransitionTime":"2026-03-18T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.306435 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.306500 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.306522 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.306546 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.306563 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:34Z","lastTransitionTime":"2026-03-18T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.314874 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.314924 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:34 crc kubenswrapper[4756]: E0318 14:01:34.315092 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.315111 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:34 crc kubenswrapper[4756]: E0318 14:01:34.315289 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:34 crc kubenswrapper[4756]: E0318 14:01:34.315488 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.409871 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.409926 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.409947 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.409976 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.409998 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:34Z","lastTransitionTime":"2026-03-18T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.512419 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.512476 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.512489 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.512506 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.512519 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:34Z","lastTransitionTime":"2026-03-18T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.615400 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.615465 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.615489 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.615545 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.615569 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:34Z","lastTransitionTime":"2026-03-18T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.717290 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.717328 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.717342 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.717357 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.717366 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:34Z","lastTransitionTime":"2026-03-18T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.820099 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.820154 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.820168 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.820182 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.820193 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:34Z","lastTransitionTime":"2026-03-18T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.922714 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.922811 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.922842 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.922874 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:34 crc kubenswrapper[4756]: I0318 14:01:34.922897 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:34Z","lastTransitionTime":"2026-03-18T14:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.025539 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.025600 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.025617 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.025649 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.025666 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:35Z","lastTransitionTime":"2026-03-18T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.128252 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.128293 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.128304 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.128323 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.128336 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:35Z","lastTransitionTime":"2026-03-18T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.231649 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.232022 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.232215 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.232385 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.232527 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:35Z","lastTransitionTime":"2026-03-18T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.335554 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.335623 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.335646 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.335672 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.335693 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:35Z","lastTransitionTime":"2026-03-18T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.437567 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.437628 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.437648 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.437675 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.437694 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:35Z","lastTransitionTime":"2026-03-18T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.540782 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.540843 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.540861 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.540886 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.540903 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:35Z","lastTransitionTime":"2026-03-18T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.644062 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.644106 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.644159 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.644185 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.644203 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:35Z","lastTransitionTime":"2026-03-18T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.747512 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.747578 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.747602 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.747631 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.747658 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:35Z","lastTransitionTime":"2026-03-18T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.850394 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.850441 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.850453 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.850470 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.850482 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:35Z","lastTransitionTime":"2026-03-18T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.952850 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.952889 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.952898 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.952912 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:35 crc kubenswrapper[4756]: I0318 14:01:35.952923 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:35Z","lastTransitionTime":"2026-03-18T14:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.055638 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.055707 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.055724 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.055748 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.055764 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:36Z","lastTransitionTime":"2026-03-18T14:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.158240 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.158300 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.158317 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.158339 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.158358 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:36Z","lastTransitionTime":"2026-03-18T14:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.196790 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9xtp5"] Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.197061 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9xtp5" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.198852 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.199228 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.200968 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.210553 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z 
is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.232106 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03
-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.246807 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.261436 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.261473 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.261485 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.261505 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.261519 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:36Z","lastTransitionTime":"2026-03-18T14:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.262869 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.279898 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.297834 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.310675 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.314409 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.314421 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:36 crc kubenswrapper[4756]: E0318 14:01:36.314510 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.314590 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:36 crc kubenswrapper[4756]: E0318 14:01:36.314758 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:36 crc kubenswrapper[4756]: E0318 14:01:36.314847 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.326588 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.326704 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c68d4bce-1eb0-4ec4-99ae-4e901a9720ef-hosts-file\") pod \"node-resolver-9xtp5\" (UID: \"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\") " pod="openshift-dns/node-resolver-9xtp5" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.326767 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vmzn\" (UniqueName: \"kubernetes.io/projected/c68d4bce-1eb0-4ec4-99ae-4e901a9720ef-kube-api-access-4vmzn\") pod \"node-resolver-9xtp5\" (UID: \"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\") " pod="openshift-dns/node-resolver-9xtp5" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.342751 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.356100 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.363847 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.363883 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.363894 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.363913 4756 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.363927 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:36Z","lastTransitionTime":"2026-03-18T14:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.427350 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vmzn\" (UniqueName: \"kubernetes.io/projected/c68d4bce-1eb0-4ec4-99ae-4e901a9720ef-kube-api-access-4vmzn\") pod \"node-resolver-9xtp5\" (UID: \"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\") " pod="openshift-dns/node-resolver-9xtp5" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.427407 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c68d4bce-1eb0-4ec4-99ae-4e901a9720ef-hosts-file\") pod \"node-resolver-9xtp5\" (UID: \"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\") " pod="openshift-dns/node-resolver-9xtp5" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.427498 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c68d4bce-1eb0-4ec4-99ae-4e901a9720ef-hosts-file\") pod \"node-resolver-9xtp5\" (UID: \"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\") " pod="openshift-dns/node-resolver-9xtp5" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.450660 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vmzn\" (UniqueName: \"kubernetes.io/projected/c68d4bce-1eb0-4ec4-99ae-4e901a9720ef-kube-api-access-4vmzn\") pod 
\"node-resolver-9xtp5\" (UID: \"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\") " pod="openshift-dns/node-resolver-9xtp5" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.466622 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.466661 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.466677 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.466722 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.466737 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:36Z","lastTransitionTime":"2026-03-18T14:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.510887 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9xtp5" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.564870 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-wz5hm"] Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.565236 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-b9pzw"] Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.565413 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.566170 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-qvpkg"] Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.568501 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.568670 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.568715 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.568803 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.568938 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.579798 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.579862 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.579885 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.579913 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.579934 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:36Z","lastTransitionTime":"2026-03-18T14:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.580347 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.580519 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.580346 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.585410 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.585488 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.586222 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.586715 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.587237 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.588264 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.607026 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.623809 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.638163 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.653493 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.665109 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.683243 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.683278 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.683289 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.683308 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.683329 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:36Z","lastTransitionTime":"2026-03-18T14:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.687060 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.700788 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.711276 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9xtp5" event={"ID":"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef","Type":"ContainerStarted","Data":"aee2615a36222307ef73681a07fdfc122c12a062c9d48a6a65661014c9f5771c"} Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.713042 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.727546 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.729817 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1c4e85b-faff-4aca-847c-f33570c542a1-system-cni-dir\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.729853 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1c4e85b-faff-4aca-847c-f33570c542a1-os-release\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.729876 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-cnibin\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.729898 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d6fl\" (UniqueName: \"kubernetes.io/projected/13703604-4b4e-4eb2-b311-88457b667918-kube-api-access-8d6fl\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.729924 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xppvk\" (UniqueName: \"kubernetes.io/projected/c10ecbb9-ddab-48c7-9a86-abd122951622-kube-api-access-xppvk\") pod \"machine-config-daemon-qvpkg\" (UID: \"c10ecbb9-ddab-48c7-9a86-abd122951622\") " pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.729949 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1c4e85b-faff-4aca-847c-f33570c542a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.729969 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-run-netns\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: 
I0318 14:01:36.729988 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-var-lib-cni-multus\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730021 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1c4e85b-faff-4aca-847c-f33570c542a1-cnibin\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730043 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1c4e85b-faff-4aca-847c-f33570c542a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730066 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-run-multus-certs\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730086 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c10ecbb9-ddab-48c7-9a86-abd122951622-mcd-auth-proxy-config\") pod \"machine-config-daemon-qvpkg\" (UID: \"c10ecbb9-ddab-48c7-9a86-abd122951622\") " 
pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730107 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-os-release\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730156 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-var-lib-kubelet\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730177 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-multus-conf-dir\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730198 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-var-lib-cni-bin\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730227 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-multus-cni-dir\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " 
pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730246 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-multus-socket-dir-parent\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730278 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47djd\" (UniqueName: \"kubernetes.io/projected/c1c4e85b-faff-4aca-847c-f33570c542a1-kube-api-access-47djd\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730335 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/13703604-4b4e-4eb2-b311-88457b667918-cni-binary-copy\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730401 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/13703604-4b4e-4eb2-b311-88457b667918-multus-daemon-config\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730431 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-run-k8s-cni-cncf-io\") pod \"multus-wz5hm\" (UID: 
\"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730453 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-system-cni-dir\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730476 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-etc-kubernetes\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730528 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c10ecbb9-ddab-48c7-9a86-abd122951622-proxy-tls\") pod \"machine-config-daemon-qvpkg\" (UID: \"c10ecbb9-ddab-48c7-9a86-abd122951622\") " pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730584 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1c4e85b-faff-4aca-847c-f33570c542a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730608 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-hostroot\") pod \"multus-wz5hm\" (UID: 
\"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.730641 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c10ecbb9-ddab-48c7-9a86-abd122951622-rootfs\") pod \"machine-config-daemon-qvpkg\" (UID: \"c10ecbb9-ddab-48c7-9a86-abd122951622\") " pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.739424 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.750716 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.765091 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.774941 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.783692 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.785591 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.785646 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.785656 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.785669 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.785678 4756 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:36Z","lastTransitionTime":"2026-03-18T14:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.800002 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e73
53ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.810414 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.820413 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.829623 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831551 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/13703604-4b4e-4eb2-b311-88457b667918-cni-binary-copy\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831605 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-run-k8s-cni-cncf-io\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831621 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/13703604-4b4e-4eb2-b311-88457b667918-multus-daemon-config\") pod \"multus-wz5hm\" (UID: 
\"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831636 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-system-cni-dir\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831665 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-hostroot\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831679 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-etc-kubernetes\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831694 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c10ecbb9-ddab-48c7-9a86-abd122951622-proxy-tls\") pod \"machine-config-daemon-qvpkg\" (UID: \"c10ecbb9-ddab-48c7-9a86-abd122951622\") " pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831711 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1c4e85b-faff-4aca-847c-f33570c542a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc 
kubenswrapper[4756]: I0318 14:01:36.831765 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c10ecbb9-ddab-48c7-9a86-abd122951622-rootfs\") pod \"machine-config-daemon-qvpkg\" (UID: \"c10ecbb9-ddab-48c7-9a86-abd122951622\") " pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831785 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d6fl\" (UniqueName: \"kubernetes.io/projected/13703604-4b4e-4eb2-b311-88457b667918-kube-api-access-8d6fl\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831827 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1c4e85b-faff-4aca-847c-f33570c542a1-system-cni-dir\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831850 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1c4e85b-faff-4aca-847c-f33570c542a1-os-release\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831868 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-cnibin\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831883 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-var-lib-cni-multus\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831926 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xppvk\" (UniqueName: \"kubernetes.io/projected/c10ecbb9-ddab-48c7-9a86-abd122951622-kube-api-access-xppvk\") pod \"machine-config-daemon-qvpkg\" (UID: \"c10ecbb9-ddab-48c7-9a86-abd122951622\") " pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831949 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1c4e85b-faff-4aca-847c-f33570c542a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.831972 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-run-netns\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832010 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1c4e85b-faff-4aca-847c-f33570c542a1-system-cni-dir\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832023 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1c4e85b-faff-4aca-847c-f33570c542a1-cnibin\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832054 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1c4e85b-faff-4aca-847c-f33570c542a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832061 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-hostroot\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832089 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-run-multus-certs\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832074 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1c4e85b-faff-4aca-847c-f33570c542a1-cnibin\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832142 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/c1c4e85b-faff-4aca-847c-f33570c542a1-os-release\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832182 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-system-cni-dir\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832189 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-cnibin\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832071 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-run-multus-certs\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832237 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c10ecbb9-ddab-48c7-9a86-abd122951622-mcd-auth-proxy-config\") pod \"machine-config-daemon-qvpkg\" (UID: \"c10ecbb9-ddab-48c7-9a86-abd122951622\") " pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832244 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-var-lib-cni-multus\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832242 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-etc-kubernetes\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832311 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-os-release\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832265 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-os-release\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832401 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-var-lib-kubelet\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832456 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-multus-conf-dir\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" 
Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832512 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-var-lib-cni-bin\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832573 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-multus-cni-dir\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832607 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-multus-socket-dir-parent\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832647 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1c4e85b-faff-4aca-847c-f33570c542a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832645 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47djd\" (UniqueName: \"kubernetes.io/projected/c1c4e85b-faff-4aca-847c-f33570c542a1-kube-api-access-47djd\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 
14:01:36.832700 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-run-netns\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832699 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/13703604-4b4e-4eb2-b311-88457b667918-multus-daemon-config\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832714 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c10ecbb9-ddab-48c7-9a86-abd122951622-rootfs\") pod \"machine-config-daemon-qvpkg\" (UID: \"c10ecbb9-ddab-48c7-9a86-abd122951622\") " pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832750 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-var-lib-cni-bin\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832783 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-var-lib-kubelet\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832810 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-multus-conf-dir\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832839 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-host-run-k8s-cni-cncf-io\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832887 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-multus-cni-dir\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832929 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/13703604-4b4e-4eb2-b311-88457b667918-multus-socket-dir-parent\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.832951 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1c4e85b-faff-4aca-847c-f33570c542a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.833070 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c10ecbb9-ddab-48c7-9a86-abd122951622-mcd-auth-proxy-config\") pod \"machine-config-daemon-qvpkg\" 
(UID: \"c10ecbb9-ddab-48c7-9a86-abd122951622\") " pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.833247 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1c4e85b-faff-4aca-847c-f33570c542a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.833290 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/13703604-4b4e-4eb2-b311-88457b667918-cni-binary-copy\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.840781 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c10ecbb9-ddab-48c7-9a86-abd122951622-proxy-tls\") pod \"machine-config-daemon-qvpkg\" (UID: \"c10ecbb9-ddab-48c7-9a86-abd122951622\") " pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.848227 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d6fl\" (UniqueName: \"kubernetes.io/projected/13703604-4b4e-4eb2-b311-88457b667918-kube-api-access-8d6fl\") pod \"multus-wz5hm\" (UID: \"13703604-4b4e-4eb2-b311-88457b667918\") " pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.848336 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.849615 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47djd\" (UniqueName: \"kubernetes.io/projected/c1c4e85b-faff-4aca-847c-f33570c542a1-kube-api-access-47djd\") pod \"multus-additional-cni-plugins-b9pzw\" (UID: \"c1c4e85b-faff-4aca-847c-f33570c542a1\") " pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.850458 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xppvk\" (UniqueName: \"kubernetes.io/projected/c10ecbb9-ddab-48c7-9a86-abd122951622-kube-api-access-xppvk\") pod \"machine-config-daemon-qvpkg\" (UID: \"c10ecbb9-ddab-48c7-9a86-abd122951622\") " pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.863522 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.878254 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.887641 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.887674 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.887683 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.887698 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.887708 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:36Z","lastTransitionTime":"2026-03-18T14:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.892305 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.897528 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wz5hm" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.906416 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.907419 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: W0318 14:01:36.909469 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13703604_4b4e_4eb2_b311_88457b667918.slice/crio-fc7bc568f928d1d3f5d27d2d0a45ac1736f2c02ed921eb4ddab61a52d302a7e2 WatchSource:0}: Error finding container fc7bc568f928d1d3f5d27d2d0a45ac1736f2c02ed921eb4ddab61a52d302a7e2: Status 404 returned error can't find the container with id fc7bc568f928d1d3f5d27d2d0a45ac1736f2c02ed921eb4ddab61a52d302a7e2 Mar 18 
14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.914279 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:01:36 crc kubenswrapper[4756]: W0318 14:01:36.918743 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1c4e85b_faff_4aca_847c_f33570c542a1.slice/crio-7efaa2d7f7ec34cf306ecd9cbb69e63c5b29eb9f3e5f61f8cc5aa66feef5022d WatchSource:0}: Error finding container 7efaa2d7f7ec34cf306ecd9cbb69e63c5b29eb9f3e5f61f8cc5aa66feef5022d: Status 404 returned error can't find the container with id 7efaa2d7f7ec34cf306ecd9cbb69e63c5b29eb9f3e5f61f8cc5aa66feef5022d Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.919994 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: W0318 14:01:36.928502 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc10ecbb9_ddab_48c7_9a86_abd122951622.slice/crio-e80e70c4e58ea767e5294978a8dac44bb3ce5b2aa79e8315296818111f829e66 WatchSource:0}: Error finding container e80e70c4e58ea767e5294978a8dac44bb3ce5b2aa79e8315296818111f829e66: Status 404 
returned error can't find the container with id e80e70c4e58ea767e5294978a8dac44bb3ce5b2aa79e8315296818111f829e66 Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.950770 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hgh2m"] Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.952180 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.954593 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.954617 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.954930 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.955111 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.955153 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.955349 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.956951 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.966494 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.991883 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.991930 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.991944 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.991964 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.991978 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:36Z","lastTransitionTime":"2026-03-18T14:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:36 crc kubenswrapper[4756]: I0318 14:01:36.992634 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:36Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.007763 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.022831 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.033719 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.033783 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-etc-openvswitch\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.033804 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-openvswitch\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.033837 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-slash\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.033860 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-log-socket\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.033882 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-systemd-units\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.033905 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-ovnkube-script-lib\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.033927 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-cni-netd\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.033946 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-env-overrides\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.033978 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-kubelet\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.034027 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c7cf6c03-98fc-4724-acde-a38f32f87496-ovn-node-metrics-cert\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.034156 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-cni-bin\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.034228 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-var-lib-openvswitch\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.034251 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-run-netns\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.034270 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q6cv\" (UniqueName: \"kubernetes.io/projected/c7cf6c03-98fc-4724-acde-a38f32f87496-kube-api-access-9q6cv\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.034314 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-run-ovn-kubernetes\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.034342 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-node-log\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc 
kubenswrapper[4756]: I0318 14:01:37.034375 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-systemd\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.034395 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-ovn\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.034415 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-ovnkube-config\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.038836 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers 
with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\
":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"
mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.053575 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.067418 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.082996 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.094537 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.094689 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.094710 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.094723 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.094733 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:37Z","lastTransitionTime":"2026-03-18T14:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.095406 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.106678 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.116530 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.134862 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-ovnkube-config\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.134921 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-systemd\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 
14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.134955 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-ovn\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.134989 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-etc-openvswitch\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135021 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-openvswitch\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135054 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135102 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-slash\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 
14:01:37.135169 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-log-socket\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135222 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-systemd-units\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135287 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-ovnkube-script-lib\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135357 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-cni-netd\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135411 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-env-overrides\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135448 4756 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-kubelet\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135482 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c7cf6c03-98fc-4724-acde-a38f32f87496-ovn-node-metrics-cert\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135533 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-cni-bin\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135586 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-var-lib-openvswitch\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135638 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q6cv\" (UniqueName: \"kubernetes.io/projected/c7cf6c03-98fc-4724-acde-a38f32f87496-kube-api-access-9q6cv\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135682 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-run-netns\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135720 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-run-ovn-kubernetes\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135778 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-node-log\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135883 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-node-log\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135879 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-ovnkube-config\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.135981 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-cni-netd\") pod \"ovnkube-node-hgh2m\" 
(UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.136465 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-ovnkube-script-lib\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.136546 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-systemd\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.136580 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-ovn\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.136603 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-etc-openvswitch\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.136648 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-openvswitch\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc 
kubenswrapper[4756]: I0318 14:01:37.136670 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.136711 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-slash\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.136734 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-log-socket\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.136756 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-systemd-units\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.136802 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-var-lib-openvswitch\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.136826 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-kubelet\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.137407 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-run-netns\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.137411 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-cni-bin\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.137446 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-run-ovn-kubernetes\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.138336 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-env-overrides\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.139450 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.140684 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c7cf6c03-98fc-4724-acde-a38f32f87496-ovn-node-metrics-cert\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.153104 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.155510 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q6cv\" (UniqueName: \"kubernetes.io/projected/c7cf6c03-98fc-4724-acde-a38f32f87496-kube-api-access-9q6cv\") pod \"ovnkube-node-hgh2m\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.166059 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.196804 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.196839 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.196848 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.196863 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.196871 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:37Z","lastTransitionTime":"2026-03-18T14:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.299654 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.299722 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.299739 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.299767 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.299787 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:37Z","lastTransitionTime":"2026-03-18T14:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.315506 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.316585 4756 scope.go:117] "RemoveContainer" containerID="c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a" Mar 18 14:01:37 crc kubenswrapper[4756]: E0318 14:01:37.316869 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 14:01:37 crc kubenswrapper[4756]: W0318 14:01:37.337458 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7cf6c03_98fc_4724_acde_a38f32f87496.slice/crio-c720882114d76f7e173d8ca834be30e689bace91f86da99fabcbad99c5d5c213 WatchSource:0}: Error finding container c720882114d76f7e173d8ca834be30e689bace91f86da99fabcbad99c5d5c213: Status 404 returned error can't find the container with id c720882114d76f7e173d8ca834be30e689bace91f86da99fabcbad99c5d5c213 Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.401861 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.401926 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.401943 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.401969 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 
14:01:37.401992 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:37Z","lastTransitionTime":"2026-03-18T14:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.505658 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.505714 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.505731 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.505754 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.505771 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:37Z","lastTransitionTime":"2026-03-18T14:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.610900 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.610941 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.610952 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.610968 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.610979 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:37Z","lastTransitionTime":"2026-03-18T14:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.713806 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.713865 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.713876 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.713893 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.713905 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:37Z","lastTransitionTime":"2026-03-18T14:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.718088 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9xtp5" event={"ID":"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef","Type":"ContainerStarted","Data":"1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.730429 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.730479 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.730493 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"e80e70c4e58ea767e5294978a8dac44bb3ce5b2aa79e8315296818111f829e66"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.734391 4756 generic.go:334] "Generic (PLEG): container finished" podID="c1c4e85b-faff-4aca-847c-f33570c542a1" containerID="b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f" exitCode=0 Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.734480 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" event={"ID":"c1c4e85b-faff-4aca-847c-f33570c542a1","Type":"ContainerDied","Data":"b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 
14:01:37.734508 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" event={"ID":"c1c4e85b-faff-4aca-847c-f33570c542a1","Type":"ContainerStarted","Data":"7efaa2d7f7ec34cf306ecd9cbb69e63c5b29eb9f3e5f61f8cc5aa66feef5022d"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.736746 4756 generic.go:334] "Generic (PLEG): container finished" podID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerID="78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f" exitCode=0 Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.736800 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerDied","Data":"78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.736822 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerStarted","Data":"c720882114d76f7e173d8ca834be30e689bace91f86da99fabcbad99c5d5c213"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.741754 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.743207 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wz5hm" event={"ID":"13703604-4b4e-4eb2-b311-88457b667918","Type":"ContainerStarted","Data":"2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.743252 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wz5hm" event={"ID":"13703604-4b4e-4eb2-b311-88457b667918","Type":"ContainerStarted","Data":"fc7bc568f928d1d3f5d27d2d0a45ac1736f2c02ed921eb4ddab61a52d302a7e2"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.761835 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.784618 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.801390 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.816384 4756 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.816429 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.816441 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.816458 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.816470 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:37Z","lastTransitionTime":"2026-03-18T14:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.819057 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.831770 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.843957 4756 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.856420 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.869070 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.879887 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.900288 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.913614 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.918388 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.918424 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.918434 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.918449 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.918459 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:37Z","lastTransitionTime":"2026-03-18T14:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.925617 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.938308 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bba
c22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.955815 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.970041 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.983603 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:37 crc kubenswrapper[4756]: I0318 14:01:37.997913 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:37Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.010635 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a2
2b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.021162 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.021212 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.021223 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:38 crc 
kubenswrapper[4756]: I0318 14:01:38.021239 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.021251 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:38Z","lastTransitionTime":"2026-03-18T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.030937 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.044236 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.058789 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.074430 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.084920 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.093446 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.112733 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc
1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.124305 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.124360 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.124371 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.124392 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.124404 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:38Z","lastTransitionTime":"2026-03-18T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.127201 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.141086 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.226749 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.226779 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.226787 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.226800 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.226810 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:38Z","lastTransitionTime":"2026-03-18T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.314852 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.314888 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.314935 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:38 crc kubenswrapper[4756]: E0318 14:01:38.314977 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:38 crc kubenswrapper[4756]: E0318 14:01:38.315051 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:38 crc kubenswrapper[4756]: E0318 14:01:38.315165 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.329555 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.329612 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.329624 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.329644 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.329655 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:38Z","lastTransitionTime":"2026-03-18T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.432470 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.432504 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.432513 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.432525 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.432534 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:38Z","lastTransitionTime":"2026-03-18T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.535060 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.535369 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.535379 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.535392 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.535401 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:38Z","lastTransitionTime":"2026-03-18T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.638081 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.638175 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.638193 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.638215 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.638232 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:38Z","lastTransitionTime":"2026-03-18T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.741011 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.741044 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.741052 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.741064 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.741072 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:38Z","lastTransitionTime":"2026-03-18T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.747510 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" event={"ID":"c1c4e85b-faff-4aca-847c-f33570c542a1","Type":"ContainerStarted","Data":"2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553"} Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.751081 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerStarted","Data":"19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8"} Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.751178 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerStarted","Data":"a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d"} Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.751194 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerStarted","Data":"b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac"} Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.751235 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerStarted","Data":"89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073"} Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.765014 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.780383 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.795422 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.809211 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.820371 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.829922 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.843691 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.843742 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.843754 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.843772 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.843783 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:38Z","lastTransitionTime":"2026-03-18T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.847997 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.861672 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.873504 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.887521 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.902701 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.919276 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.931701 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a2
2b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.946138 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.946418 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.946504 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:38 crc 
kubenswrapper[4756]: I0318 14:01:38.946605 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.946666 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:38Z","lastTransitionTime":"2026-03-18T14:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:38 crc kubenswrapper[4756]: I0318 14:01:38.948661 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:38Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.049082 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.049342 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.049404 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.049485 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.049545 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:39Z","lastTransitionTime":"2026-03-18T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.152406 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.152481 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.152504 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.152532 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.152553 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:39Z","lastTransitionTime":"2026-03-18T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.255831 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.255901 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.255926 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.255957 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.255981 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:39Z","lastTransitionTime":"2026-03-18T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.334718 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.348090 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.359698 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.359737 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.359748 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.359764 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.359775 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:39Z","lastTransitionTime":"2026-03-18T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.361335 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.394677 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.413444 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.432974 
4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.446145 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.458818 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.461541 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:39 crc 
kubenswrapper[4756]: I0318 14:01:39.461588 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.461605 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.461626 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.461641 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:39Z","lastTransitionTime":"2026-03-18T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.474863 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.486820 4756 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.508240 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.523419 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.535518 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.545988 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.563321 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.563348 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.563356 4756 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.563370 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.563379 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:39Z","lastTransitionTime":"2026-03-18T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.665184 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.665246 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.665268 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.665301 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.665323 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:39Z","lastTransitionTime":"2026-03-18T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.763814 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerStarted","Data":"1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad"} Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.763895 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerStarted","Data":"fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b"} Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.767327 4756 generic.go:334] "Generic (PLEG): container finished" podID="c1c4e85b-faff-4aca-847c-f33570c542a1" containerID="2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553" exitCode=0 Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.767406 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" event={"ID":"c1c4e85b-faff-4aca-847c-f33570c542a1","Type":"ContainerDied","Data":"2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553"} Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.767872 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.767910 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.767925 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.767944 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 
14:01:39.767957 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:39Z","lastTransitionTime":"2026-03-18T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.788103 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.811968 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.826770 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.842995 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.853711 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.882884 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.882936 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.882949 4756 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.882969 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.882982 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:39Z","lastTransitionTime":"2026-03-18T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.905418 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a9
3\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c35
4d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.936575 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.955813 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.968053 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a2
2b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.983542 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.987284 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.987315 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.987326 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.987341 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.987352 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:39Z","lastTransitionTime":"2026-03-18T14:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:39 crc kubenswrapper[4756]: I0318 14:01:39.996369 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.011868 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.025602 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.037682 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.089419 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.089820 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.089831 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.089848 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.089860 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:40Z","lastTransitionTime":"2026-03-18T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.103280 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.103312 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.103322 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.103336 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.103347 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:40Z","lastTransitionTime":"2026-03-18T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:40 crc kubenswrapper[4756]: E0318 14:01:40.114930 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.117904 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.117937 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.117946 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.117957 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.117966 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:40Z","lastTransitionTime":"2026-03-18T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:40 crc kubenswrapper[4756]: E0318 14:01:40.131678 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.134854 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.134900 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.134913 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.134930 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.134943 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:40Z","lastTransitionTime":"2026-03-18T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:40 crc kubenswrapper[4756]: E0318 14:01:40.147372 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.151303 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.151356 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.151369 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.151391 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.151405 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:40Z","lastTransitionTime":"2026-03-18T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:40 crc kubenswrapper[4756]: E0318 14:01:40.164021 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.167720 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.167775 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.167793 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.167816 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.167835 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:40Z","lastTransitionTime":"2026-03-18T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:40 crc kubenswrapper[4756]: E0318 14:01:40.181509 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: E0318 14:01:40.181697 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.191900 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.191920 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.191927 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.191940 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.191949 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:40Z","lastTransitionTime":"2026-03-18T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.293767 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.293803 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.293811 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.293823 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.293833 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:40Z","lastTransitionTime":"2026-03-18T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.315225 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.315262 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.315303 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:40 crc kubenswrapper[4756]: E0318 14:01:40.315397 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:40 crc kubenswrapper[4756]: E0318 14:01:40.315469 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:40 crc kubenswrapper[4756]: E0318 14:01:40.315604 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.397057 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.397147 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.397188 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.397212 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.397232 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:40Z","lastTransitionTime":"2026-03-18T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.500747 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.500813 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.500830 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.500854 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.500873 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:40Z","lastTransitionTime":"2026-03-18T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.604183 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.604260 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.604284 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.604313 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.604334 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:40Z","lastTransitionTime":"2026-03-18T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.707248 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.707314 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.707334 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.707366 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.707386 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:40Z","lastTransitionTime":"2026-03-18T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.776054 4756 generic.go:334] "Generic (PLEG): container finished" podID="c1c4e85b-faff-4aca-847c-f33570c542a1" containerID="3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336" exitCode=0 Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.776165 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" event={"ID":"c1c4e85b-faff-4aca-847c-f33570c542a1","Type":"ContainerDied","Data":"3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336"} Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.797971 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.810030 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.810067 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.810076 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 
14:01:40.810093 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.810103 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:40Z","lastTransitionTime":"2026-03-18T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.815196 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.830054 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c3019
6d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.849196 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.869476 4756 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.884008 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.897252 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.910030 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bba
c22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.913672 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.913721 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.913736 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.913759 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.913774 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:40Z","lastTransitionTime":"2026-03-18T14:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.923355 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.933741 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.955627 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.966178 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.979496 4756 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:40 crc kubenswrapper[4756]: I0318 14:01:40.990350 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:40Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.016150 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.016200 4756 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.016216 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.016238 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.016255 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:41Z","lastTransitionTime":"2026-03-18T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.118591 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.118633 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.118644 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.118661 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.118672 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:41Z","lastTransitionTime":"2026-03-18T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.221966 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.222017 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.222029 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.222048 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.222061 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:41Z","lastTransitionTime":"2026-03-18T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.324235 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.324310 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.324327 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.324351 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.324368 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:41Z","lastTransitionTime":"2026-03-18T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.427427 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.427513 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.427533 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.427557 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.427575 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:41Z","lastTransitionTime":"2026-03-18T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.530324 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.530400 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.530417 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.530441 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.530460 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:41Z","lastTransitionTime":"2026-03-18T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.633585 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.633665 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.633708 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.633741 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.633767 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:41Z","lastTransitionTime":"2026-03-18T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.737688 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.737750 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.737766 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.737790 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.737809 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:41Z","lastTransitionTime":"2026-03-18T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.784074 4756 generic.go:334] "Generic (PLEG): container finished" podID="c1c4e85b-faff-4aca-847c-f33570c542a1" containerID="354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04" exitCode=0 Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.784138 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" event={"ID":"c1c4e85b-faff-4aca-847c-f33570c542a1","Type":"ContainerDied","Data":"354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04"} Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.790822 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerStarted","Data":"d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd"} Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.816834 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:01:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.838375 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.840712 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.840757 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.840775 4756 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.840794 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.840811 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:41Z","lastTransitionTime":"2026-03-18T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.872058 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a9
3\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c35
4d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.887325 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.911668 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.927857 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a2
2b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.943338 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.943375 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.943382 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:41 crc 
kubenswrapper[4756]: I0318 14:01:41.943398 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.943407 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:41Z","lastTransitionTime":"2026-03-18T14:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.948659 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.961406 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.974641 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.986599 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:41 crc kubenswrapper[4756]: I0318 14:01:41.997322 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.011251 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:42Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.025093 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:42Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.036032 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:42Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.044965 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.045001 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.045009 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.045043 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.045056 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:42Z","lastTransitionTime":"2026-03-18T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.147638 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.147673 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.147681 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.147693 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.147703 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:42Z","lastTransitionTime":"2026-03-18T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.198750 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.199028 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.199181 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:02:14.199099747 +0000 UTC m=+135.513517782 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.199187 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.199275 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.199296 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:02:14.199280292 +0000 UTC m=+135.513698307 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.199401 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.199481 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:02:14.199459727 +0000 UTC m=+135.513877742 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.250695 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.250756 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.250775 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.250807 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.250830 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:42Z","lastTransitionTime":"2026-03-18T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.300194 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.300301 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.300372 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.300409 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.300422 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.300484 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 14:02:14.30046563 +0000 UTC m=+135.614883615 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.300484 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.300516 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.300538 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.300611 4756 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 14:02:14.300586334 +0000 UTC m=+135.615004349 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.314913 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.315023 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.314913 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.315249 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.315358 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:42 crc kubenswrapper[4756]: E0318 14:01:42.315534 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.353728 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.353768 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.353782 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.353798 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.353808 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:42Z","lastTransitionTime":"2026-03-18T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.456173 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.456213 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.456226 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.456243 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.456255 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:42Z","lastTransitionTime":"2026-03-18T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.558528 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.558587 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.558607 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.558631 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.558648 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:42Z","lastTransitionTime":"2026-03-18T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.661594 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.661641 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.661651 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.661667 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.661676 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:42Z","lastTransitionTime":"2026-03-18T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.764235 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.764275 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.764288 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.764305 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.764318 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:42Z","lastTransitionTime":"2026-03-18T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.798176 4756 generic.go:334] "Generic (PLEG): container finished" podID="c1c4e85b-faff-4aca-847c-f33570c542a1" containerID="09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb" exitCode=0 Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.798233 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" event={"ID":"c1c4e85b-faff-4aca-847c-f33570c542a1","Type":"ContainerDied","Data":"09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb"} Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.817060 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:42Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.833521 4756 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:42Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.848893 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:42Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.866847 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.866921 4756 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.866936 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.866958 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.866972 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:42Z","lastTransitionTime":"2026-03-18T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.875638 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:42Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.892934 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:42Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.907699 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cce
eeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":
\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:42Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.918784 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a2
2b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:42Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.934151 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:42Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.946210 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:42Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.960522 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:42Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.969165 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.969204 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.969213 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.969226 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.969237 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:42Z","lastTransitionTime":"2026-03-18T14:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.979811 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:42Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:42 crc kubenswrapper[4756]: I0318 14:01:42.995265 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:42Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.005009 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.005875 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-fg65c"] Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.006301 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fg65c" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.008775 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.009333 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.009669 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.009674 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.020012 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.030738 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.054708 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.069674 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.075163 4756 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.075258 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.075274 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.075329 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.075345 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:43Z","lastTransitionTime":"2026-03-18T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.088047 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.096760 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.107133 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.107360 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvtfr\" (UniqueName: \"kubernetes.io/projected/dc241e7b-956f-4c3e-be3e-e239872d2c3b-kube-api-access-kvtfr\") pod \"node-ca-fg65c\" (UID: \"dc241e7b-956f-4c3e-be3e-e239872d2c3b\") " pod="openshift-image-registry/node-ca-fg65c" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.107447 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dc241e7b-956f-4c3e-be3e-e239872d2c3b-serviceca\") pod \"node-ca-fg65c\" (UID: \"dc241e7b-956f-4c3e-be3e-e239872d2c3b\") " pod="openshift-image-registry/node-ca-fg65c" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.107478 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc241e7b-956f-4c3e-be3e-e239872d2c3b-host\") pod 
\"node-ca-fg65c\" (UID: \"dc241e7b-956f-4c3e-be3e-e239872d2c3b\") " pod="openshift-image-registry/node-ca-fg65c" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.122394 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\
\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.140866 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.154935 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a2
2b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.176686 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.177491 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.177528 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.177544 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.177563 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.177578 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:43Z","lastTransitionTime":"2026-03-18T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.194185 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.208640 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvtfr\" (UniqueName: \"kubernetes.io/projected/dc241e7b-956f-4c3e-be3e-e239872d2c3b-kube-api-access-kvtfr\") pod \"node-ca-fg65c\" (UID: \"dc241e7b-956f-4c3e-be3e-e239872d2c3b\") " pod="openshift-image-registry/node-ca-fg65c" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.208759 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dc241e7b-956f-4c3e-be3e-e239872d2c3b-serviceca\") pod \"node-ca-fg65c\" (UID: \"dc241e7b-956f-4c3e-be3e-e239872d2c3b\") " pod="openshift-image-registry/node-ca-fg65c" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.208806 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc241e7b-956f-4c3e-be3e-e239872d2c3b-host\") pod \"node-ca-fg65c\" (UID: \"dc241e7b-956f-4c3e-be3e-e239872d2c3b\") " pod="openshift-image-registry/node-ca-fg65c" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.208879 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc241e7b-956f-4c3e-be3e-e239872d2c3b-host\") pod \"node-ca-fg65c\" (UID: \"dc241e7b-956f-4c3e-be3e-e239872d2c3b\") " 
pod="openshift-image-registry/node-ca-fg65c" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.209048 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.210418 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dc241e7b-956f-4c3e-be3e-e239872d2c3b-serviceca\") pod \"node-ca-fg65c\" (UID: \"dc241e7b-956f-4c3e-be3e-e239872d2c3b\") " pod="openshift-image-registry/node-ca-fg65c" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.223186 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.227270 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvtfr\" (UniqueName: \"kubernetes.io/projected/dc241e7b-956f-4c3e-be3e-e239872d2c3b-kube-api-access-kvtfr\") pod \"node-ca-fg65c\" (UID: \"dc241e7b-956f-4c3e-be3e-e239872d2c3b\") " pod="openshift-image-registry/node-ca-fg65c" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.243830 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.258558 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.280265 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.280289 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.280299 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.280314 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.280326 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:43Z","lastTransitionTime":"2026-03-18T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.384345 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.384411 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.384427 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.384457 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.384476 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:43Z","lastTransitionTime":"2026-03-18T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.417944 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-fg65c" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.487011 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.487053 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.487068 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.487088 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.487102 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:43Z","lastTransitionTime":"2026-03-18T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.590247 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.590307 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.590326 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.590350 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.590368 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:43Z","lastTransitionTime":"2026-03-18T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.693195 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.693239 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.693251 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.693268 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.693281 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:43Z","lastTransitionTime":"2026-03-18T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.795870 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.796185 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.796198 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.796217 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.796232 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:43Z","lastTransitionTime":"2026-03-18T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.801779 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fg65c" event={"ID":"dc241e7b-956f-4c3e-be3e-e239872d2c3b","Type":"ContainerStarted","Data":"1b9f5f77316cd143e2f347ba6b814a1ca72b51038bf19b5ec79492dbec2a6fcf"} Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.805203 4756 generic.go:334] "Generic (PLEG): container finished" podID="c1c4e85b-faff-4aca-847c-f33570c542a1" containerID="8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0" exitCode=0 Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.805273 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" event={"ID":"c1c4e85b-faff-4aca-847c-f33570c542a1","Type":"ContainerDied","Data":"8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0"} Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.813407 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerStarted","Data":"ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4"} Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.813613 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.813628 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.813695 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.821779 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.840523 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.840700 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.844582 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.861656 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bba
c22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.876722 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.888801 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.899026 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.899055 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.899062 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.899075 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.899083 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:43Z","lastTransitionTime":"2026-03-18T14:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.910201 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.923300 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.933885 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.942941 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.951669 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.961919 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.973313 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.986825 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:43 crc kubenswrapper[4756]: I0318 14:01:43.998856 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a2
2b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:43Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.001367 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.001417 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.001429 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:44 crc 
kubenswrapper[4756]: I0318 14:01:44.001449 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.001461 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:44Z","lastTransitionTime":"2026-03-18T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.015708 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.027835 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.040152 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.052965 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.068474 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.078575 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.088285 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.103720 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.103763 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.103775 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.103790 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.103800 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:44Z","lastTransitionTime":"2026-03-18T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.118461 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.145986 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.158536 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.166895 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.178223 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.188373 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.199595 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.206008 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.206187 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.206280 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.206380 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.206467 4756 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:44Z","lastTransitionTime":"2026-03-18T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.209101 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aa
f09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.225300 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.309714 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.309775 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.309791 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.309818 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.309836 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:44Z","lastTransitionTime":"2026-03-18T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.315095 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.315110 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.315148 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:44 crc kubenswrapper[4756]: E0318 14:01:44.315531 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:44 crc kubenswrapper[4756]: E0318 14:01:44.315635 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:44 crc kubenswrapper[4756]: E0318 14:01:44.315343 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.412585 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.412685 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.412712 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.412745 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.412774 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:44Z","lastTransitionTime":"2026-03-18T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.516826 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.516878 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.516893 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.516915 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.516933 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:44Z","lastTransitionTime":"2026-03-18T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.619764 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.619832 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.619854 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.619883 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.619907 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:44Z","lastTransitionTime":"2026-03-18T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.723521 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.723561 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.723572 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.723585 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.723602 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:44Z","lastTransitionTime":"2026-03-18T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.819284 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fg65c" event={"ID":"dc241e7b-956f-4c3e-be3e-e239872d2c3b","Type":"ContainerStarted","Data":"f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86"} Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.823570 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" event={"ID":"c1c4e85b-faff-4aca-847c-f33570c542a1","Type":"ContainerStarted","Data":"db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f"} Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.825220 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.825318 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.825343 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.825372 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.825396 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:44Z","lastTransitionTime":"2026-03-18T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.839665 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.853040 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a2
2b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.873320 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.891253 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.905714 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.922816 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.931359 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.931398 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.931410 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.931426 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:44 crc 
kubenswrapper[4756]: I0318 14:01:44.931439 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:44Z","lastTransitionTime":"2026-03-18T14:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.941460 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.955031 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.970173 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.980977 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:44 crc kubenswrapper[4756]: I0318 14:01:44.991604 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b
7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:44Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.003025 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.012634 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc 
kubenswrapper[4756]: I0318 14:01:45.033578 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.033621 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.033633 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.033647 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.033655 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:45Z","lastTransitionTime":"2026-03-18T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.033749 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.048908 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.060596 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.077572 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.089644 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.100074 4756 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.107297 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01
:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.116798 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.126868 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.135328 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:45 crc 
kubenswrapper[4756]: I0318 14:01:45.135364 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.135401 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.135415 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.135426 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:45Z","lastTransitionTime":"2026-03-18T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.139594 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e0
67bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.148424 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: 
I0318 14:01:45.163255 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.175386 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.188049 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.199144 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.210830 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.220964 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:45Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.238621 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.238674 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.238683 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.238701 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.238729 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:45Z","lastTransitionTime":"2026-03-18T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.340709 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.340759 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.340778 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.340799 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.340815 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:45Z","lastTransitionTime":"2026-03-18T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.443268 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.443323 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.443335 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.443350 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.443364 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:45Z","lastTransitionTime":"2026-03-18T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.545786 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.545845 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.545860 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.545883 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.545900 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:45Z","lastTransitionTime":"2026-03-18T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.648290 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.648347 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.648361 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.648378 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.648390 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:45Z","lastTransitionTime":"2026-03-18T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.750969 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.751010 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.751026 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.751047 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.751063 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:45Z","lastTransitionTime":"2026-03-18T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.852706 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.852767 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.852784 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.852807 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.852826 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:45Z","lastTransitionTime":"2026-03-18T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.954677 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.954751 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.954774 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.954803 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:45 crc kubenswrapper[4756]: I0318 14:01:45.954825 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:45Z","lastTransitionTime":"2026-03-18T14:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.056712 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.056749 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.056758 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.056772 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.056783 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:46Z","lastTransitionTime":"2026-03-18T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.159180 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.159240 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.159253 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.159272 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.159285 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:46Z","lastTransitionTime":"2026-03-18T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.261642 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.261695 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.261712 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.261735 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.261752 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:46Z","lastTransitionTime":"2026-03-18T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.314664 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.314729 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.314782 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:46 crc kubenswrapper[4756]: E0318 14:01:46.314835 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:46 crc kubenswrapper[4756]: E0318 14:01:46.314949 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:46 crc kubenswrapper[4756]: E0318 14:01:46.315088 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.365100 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.365162 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.365171 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.365187 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.365198 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:46Z","lastTransitionTime":"2026-03-18T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.468803 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.469675 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.469742 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.469778 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.469801 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:46Z","lastTransitionTime":"2026-03-18T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.573685 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.573730 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.573742 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.573765 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.573789 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:46Z","lastTransitionTime":"2026-03-18T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.676588 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.676630 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.676642 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.676659 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.676669 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:46Z","lastTransitionTime":"2026-03-18T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.780169 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.780215 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.780228 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.780244 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.780260 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:46Z","lastTransitionTime":"2026-03-18T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.833341 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovnkube-controller/0.log" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.836542 4756 generic.go:334] "Generic (PLEG): container finished" podID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerID="ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4" exitCode=1 Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.836594 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerDied","Data":"ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4"} Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.837738 4756 scope.go:117] "RemoveContainer" containerID="ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.856570 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:46Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.872594 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:46Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.883455 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.883534 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.883559 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.883595 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.883620 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:46Z","lastTransitionTime":"2026-03-18T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.888801 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:46Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.910676 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:46Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.923617 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:46Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.940855 4756 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:46Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.951173 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01
:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:46Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.973490 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14
:01:46Z\\\",\\\"message\\\":\\\" removal\\\\nI0318 14:01:46.119563 6550 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 14:01:46.119586 6550 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 14:01:46.119597 6550 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:01:46.119603 6550 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 14:01:46.119622 6550 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:01:46.119630 6550 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:01:46.119637 6550 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 14:01:46.119645 6550 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:01:46.119661 6550 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 14:01:46.119743 6550 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:01:46.119780 6550 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:01:46.119802 6550 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 14:01:46.119814 6550 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:01:46.119826 6550 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:01:46.119829 6550 factory.go:656] Stopping watch factory\\\\nI0318 14:01:46.119847 6550 ovnkube.go:599] Stopped ovnkube\\\\nI0318 
14:01:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad
376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:46Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.985604 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.985643 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.985653 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.985668 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.985679 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:46Z","lastTransitionTime":"2026-03-18T14:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:46 crc kubenswrapper[4756]: I0318 14:01:46.990676 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:46Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.008475 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.023048 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167
600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:
01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.034903 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc 
kubenswrapper[4756]: I0318 14:01:47.047068 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.057021 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.067870 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.087568 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.087609 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.087618 4756 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.087631 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.087660 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:47Z","lastTransitionTime":"2026-03-18T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.189985 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.190328 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.190338 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.190354 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.190369 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:47Z","lastTransitionTime":"2026-03-18T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.292462 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.292501 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.292509 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.292524 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.292533 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:47Z","lastTransitionTime":"2026-03-18T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.394553 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.394612 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.394623 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.394637 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.394647 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:47Z","lastTransitionTime":"2026-03-18T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.496449 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.496487 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.496500 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.496516 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.496528 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:47Z","lastTransitionTime":"2026-03-18T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.599098 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.599179 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.599192 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.599209 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.599221 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:47Z","lastTransitionTime":"2026-03-18T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.701735 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.701808 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.701826 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.701851 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.701870 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:47Z","lastTransitionTime":"2026-03-18T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.804660 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.804702 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.804712 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.804731 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.804741 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:47Z","lastTransitionTime":"2026-03-18T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.841613 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovnkube-controller/1.log" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.842550 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovnkube-controller/0.log" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.846375 4756 generic.go:334] "Generic (PLEG): container finished" podID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerID="37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094" exitCode=1 Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.846421 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerDied","Data":"37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094"} Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.846459 4756 scope.go:117] "RemoveContainer" containerID="ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.847430 4756 scope.go:117] "RemoveContainer" containerID="37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094" Mar 18 14:01:47 crc kubenswrapper[4756]: E0318 14:01:47.847694 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.860039 4756 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.872174 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.882085 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.894011 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc 
kubenswrapper[4756]: I0318 14:01:47.907613 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.907666 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.907682 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.907700 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.907712 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:47Z","lastTransitionTime":"2026-03-18T14:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.918020 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.930568 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.940880 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.950924 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.969376 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:46Z\\\",\\\"message\\\":\\\" removal\\\\nI0318 14:01:46.119563 6550 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 14:01:46.119586 6550 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 14:01:46.119597 6550 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:01:46.119603 6550 handler.go:190] Sending *v1.Pod 
event handler 6 for removal\\\\nI0318 14:01:46.119622 6550 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:01:46.119630 6550 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:01:46.119637 6550 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 14:01:46.119645 6550 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:01:46.119661 6550 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 14:01:46.119743 6550 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:01:46.119780 6550 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:01:46.119802 6550 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 14:01:46.119814 6550 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:01:46.119826 6550 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:01:46.119829 6550 factory.go:656] Stopping watch factory\\\\nI0318 14:01:46.119847 6550 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:01:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:47Z\\\",\\\"message\\\":\\\"nsact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 14:01:47.754920 6712 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"ba175bbe-5cc4-47e6-a32d-57693e1320bd\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, 
b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:47 crc kubenswrapper[4756]: I0318 14:01:47.987454 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.000826 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:47Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.010662 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:48 crc 
kubenswrapper[4756]: I0318 14:01:48.010707 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.010722 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.010743 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.010757 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:48Z","lastTransitionTime":"2026-03-18T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.016528 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e0
67bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:48Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.029720 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:48Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.042953 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:48Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.056187 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:48Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.113071 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.113183 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.113208 4756 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.113237 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.113258 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:48Z","lastTransitionTime":"2026-03-18T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.216690 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.216743 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.216759 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.216784 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.216801 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:48Z","lastTransitionTime":"2026-03-18T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.315564 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.315573 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:48 crc kubenswrapper[4756]: E0318 14:01:48.315753 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.315585 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:48 crc kubenswrapper[4756]: E0318 14:01:48.315978 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:48 crc kubenswrapper[4756]: E0318 14:01:48.316071 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.325515 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.325584 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.325616 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.325652 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.325676 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:48Z","lastTransitionTime":"2026-03-18T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.429042 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.429164 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.429190 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.429215 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.429232 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:48Z","lastTransitionTime":"2026-03-18T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.531787 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.531872 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.531899 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.531935 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.531960 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:48Z","lastTransitionTime":"2026-03-18T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.634883 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.634945 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.634962 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.634987 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.635004 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:48Z","lastTransitionTime":"2026-03-18T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.738180 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.738249 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.738276 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.738310 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.738332 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:48Z","lastTransitionTime":"2026-03-18T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.841718 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.841778 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.841794 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.841816 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.841833 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:48Z","lastTransitionTime":"2026-03-18T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.852756 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovnkube-controller/1.log" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.928022 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4"] Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.929245 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.931732 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.931763 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.945222 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:48Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.945418 4756 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.945448 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.945462 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.945482 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.945495 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:48Z","lastTransitionTime":"2026-03-18T14:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.967083 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da217a0c-d8f3-4de1-b997-28d6683ede25-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mdjt4\" (UID: \"da217a0c-d8f3-4de1-b997-28d6683ede25\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.967151 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da217a0c-d8f3-4de1-b997-28d6683ede25-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mdjt4\" (UID: \"da217a0c-d8f3-4de1-b997-28d6683ede25\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.967193 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfbbw\" (UniqueName: \"kubernetes.io/projected/da217a0c-d8f3-4de1-b997-28d6683ede25-kube-api-access-jfbbw\") pod \"ovnkube-control-plane-749d76644c-mdjt4\" (UID: \"da217a0c-d8f3-4de1-b997-28d6683ede25\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.967432 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da217a0c-d8f3-4de1-b997-28d6683ede25-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mdjt4\" (UID: \"da217a0c-d8f3-4de1-b997-28d6683ede25\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.977250 4756 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced77
01c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:48Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:48 crc kubenswrapper[4756]: I0318 14:01:48.996865 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:48Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc 
kubenswrapper[4756]: I0318 14:01:49.014663 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.027377 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.049001 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.049072 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.049092 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.049161 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.049203 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:49Z","lastTransitionTime":"2026-03-18T14:01:49Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.056598 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:46Z\\\",\\\"message\\\":\\\" removal\\\\nI0318 14:01:46.119563 6550 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 14:01:46.119586 6550 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 14:01:46.119597 6550 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:01:46.119603 6550 handler.go:190] Sending *v1.Pod 
event handler 6 for removal\\\\nI0318 14:01:46.119622 6550 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:01:46.119630 6550 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:01:46.119637 6550 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 14:01:46.119645 6550 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:01:46.119661 6550 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 14:01:46.119743 6550 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:01:46.119780 6550 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:01:46.119802 6550 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 14:01:46.119814 6550 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:01:46.119826 6550 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:01:46.119829 6550 factory.go:656] Stopping watch factory\\\\nI0318 14:01:46.119847 6550 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:01:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:47Z\\\",\\\"message\\\":\\\"nsact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 14:01:47.754920 6712 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"ba175bbe-5cc4-47e6-a32d-57693e1320bd\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, 
b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.068357 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da217a0c-d8f3-4de1-b997-28d6683ede25-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mdjt4\" (UID: \"da217a0c-d8f3-4de1-b997-28d6683ede25\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.068454 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da217a0c-d8f3-4de1-b997-28d6683ede25-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mdjt4\" (UID: \"da217a0c-d8f3-4de1-b997-28d6683ede25\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.068513 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-jfbbw\" (UniqueName: \"kubernetes.io/projected/da217a0c-d8f3-4de1-b997-28d6683ede25-kube-api-access-jfbbw\") pod \"ovnkube-control-plane-749d76644c-mdjt4\" (UID: \"da217a0c-d8f3-4de1-b997-28d6683ede25\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.068617 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da217a0c-d8f3-4de1-b997-28d6683ede25-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mdjt4\" (UID: \"da217a0c-d8f3-4de1-b997-28d6683ede25\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.069706 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da217a0c-d8f3-4de1-b997-28d6683ede25-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mdjt4\" (UID: \"da217a0c-d8f3-4de1-b997-28d6683ede25\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.069992 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da217a0c-d8f3-4de1-b997-28d6683ede25-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mdjt4\" (UID: \"da217a0c-d8f3-4de1-b997-28d6683ede25\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.074584 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da217a0c-d8f3-4de1-b997-28d6683ede25-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mdjt4\" (UID: \"da217a0c-d8f3-4de1-b997-28d6683ede25\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" 
Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.077317 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.094457 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.095668 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfbbw\" (UniqueName: 
\"kubernetes.io/projected/da217a0c-d8f3-4de1-b997-28d6683ede25-kube-api-access-jfbbw\") pod \"ovnkube-control-plane-749d76644c-mdjt4\" (UID: \"da217a0c-d8f3-4de1-b997-28d6683ede25\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.114299 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfb
b085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":
\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.128897 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c3699
4f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03
-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.149616 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.151487 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.151516 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.151544 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.151559 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.151570 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:49Z","lastTransitionTime":"2026-03-18T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.164137 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.177838 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.189447 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.205585 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.218901 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb27670
3f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.249091 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.253665 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.253695 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.253704 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.253718 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.253731 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:49Z","lastTransitionTime":"2026-03-18T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:49 crc kubenswrapper[4756]: W0318 14:01:49.263340 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda217a0c_d8f3_4de1_b997_28d6683ede25.slice/crio-bb3f957b7e04e3abf63331a08971b40259dc6c4596363c434576c8f3229c393c WatchSource:0}: Error finding container bb3f957b7e04e3abf63331a08971b40259dc6c4596363c434576c8f3229c393c: Status 404 returned error can't find the container with id bb3f957b7e04e3abf63331a08971b40259dc6c4596363c434576c8f3229c393c Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.315161 4756 scope.go:117] "RemoveContainer" containerID="c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.330852 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb84
59fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"moun
tPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f2
49e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.340945 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a2
2b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.355583 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.355610 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.355623 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:49 crc 
kubenswrapper[4756]: I0318 14:01:49.355640 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.355652 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:49Z","lastTransitionTime":"2026-03-18T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.365326 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:46Z\\\",\\\"message\\\":\\\" removal\\\\nI0318 14:01:46.119563 6550 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 14:01:46.119586 6550 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 14:01:46.119597 6550 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:01:46.119603 6550 handler.go:190] Sending *v1.Pod 
event handler 6 for removal\\\\nI0318 14:01:46.119622 6550 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:01:46.119630 6550 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:01:46.119637 6550 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 14:01:46.119645 6550 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:01:46.119661 6550 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 14:01:46.119743 6550 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:01:46.119780 6550 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:01:46.119802 6550 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 14:01:46.119814 6550 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:01:46.119826 6550 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:01:46.119829 6550 factory.go:656] Stopping watch factory\\\\nI0318 14:01:46.119847 6550 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:01:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:47Z\\\",\\\"message\\\":\\\"nsact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 14:01:47.754920 6712 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"ba175bbe-5cc4-47e6-a32d-57693e1320bd\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, 
b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.380597 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.393972 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.411284 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.425345 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.436255 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.446417 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.456914 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.458472 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.458496 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.458503 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.458515 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.458524 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:49Z","lastTransitionTime":"2026-03-18T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.467096 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.479782 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.491501 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.505874 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc 
kubenswrapper[4756]: I0318 14:01:49.525271 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.538097 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.560663 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.560697 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.560706 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.560720 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.560730 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:49Z","lastTransitionTime":"2026-03-18T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.663920 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.663964 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.663975 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.663992 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.664003 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:49Z","lastTransitionTime":"2026-03-18T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.688926 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-gfdtl"] Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.690227 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:01:49 crc kubenswrapper[4756]: E0318 14:01:49.690300 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.712394 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.734793 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:46Z\\\",\\\"message\\\":\\\" removal\\\\nI0318 14:01:46.119563 6550 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 14:01:46.119586 6550 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 14:01:46.119597 6550 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:01:46.119603 6550 handler.go:190] Sending *v1.Pod 
event handler 6 for removal\\\\nI0318 14:01:46.119622 6550 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:01:46.119630 6550 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:01:46.119637 6550 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 14:01:46.119645 6550 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:01:46.119661 6550 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 14:01:46.119743 6550 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:01:46.119780 6550 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:01:46.119802 6550 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 14:01:46.119814 6550 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:01:46.119826 6550 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:01:46.119829 6550 factory.go:656] Stopping watch factory\\\\nI0318 14:01:46.119847 6550 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:01:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:47Z\\\",\\\"message\\\":\\\"nsact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 14:01:47.754920 6712 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"ba175bbe-5cc4-47e6-a32d-57693e1320bd\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, 
b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.748531 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.762476 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.765763 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:49 crc 
kubenswrapper[4756]: I0318 14:01:49.765804 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.765816 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.765833 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.765845 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:49Z","lastTransitionTime":"2026-03-18T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.774910 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtqtk\" (UniqueName: \"kubernetes.io/projected/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-kube-api-access-mtqtk\") pod \"network-metrics-daemon-gfdtl\" (UID: \"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\") " pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.774961 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs\") pod \"network-metrics-daemon-gfdtl\" (UID: \"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\") " pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.777980 4756 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.792554 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.804953 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.819449 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.831508 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.846075 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.856912 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb27670
3f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.862084 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" event={"ID":"da217a0c-d8f3-4de1-b997-28d6683ede25","Type":"ContainerStarted","Data":"fcfd64e934ef06fe9cf21d4b36ae864bd59e8807ed5d15684d3ea0a7e92e96e5"} Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.862143 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" event={"ID":"da217a0c-d8f3-4de1-b997-28d6683ede25","Type":"ContainerStarted","Data":"74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0"} Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 
14:01:49.862160 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" event={"ID":"da217a0c-d8f3-4de1-b997-28d6683ede25","Type":"ContainerStarted","Data":"bb3f957b7e04e3abf63331a08971b40259dc6c4596363c434576c8f3229c393c"} Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.863745 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.865311 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8"} Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.865664 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.867458 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.867486 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.867498 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.867513 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.867525 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:49Z","lastTransitionTime":"2026-03-18T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.871451 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.875377 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtqtk\" (UniqueName: \"kubernetes.io/projected/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-kube-api-access-mtqtk\") pod \"network-metrics-daemon-gfdtl\" (UID: \"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\") " pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.875418 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs\") pod \"network-metrics-daemon-gfdtl\" (UID: \"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\") " pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:01:49 crc kubenswrapper[4756]: E0318 14:01:49.875517 4756 
secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:01:49 crc kubenswrapper[4756]: E0318 14:01:49.875565 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs podName:c11d2088-741c-4812-8eb2-ccfc3d0c7d11 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:50.375548816 +0000 UTC m=+111.689966791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs") pod "network-metrics-daemon-gfdtl" (UID: "c11d2088-741c-4812-8eb2-ccfc3d0c7d11") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.880034 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.889853 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.891915 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtqtk\" (UniqueName: \"kubernetes.io/projected/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-kube-api-access-mtqtk\") pod 
\"network-metrics-daemon-gfdtl\" (UID: \"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\") " pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.907202 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b267
02f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8
b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"fi
nishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.919841 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.931103 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.941898 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae864bd59e8807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.955078 4756 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.966539 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.972214 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.972242 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.972252 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.972268 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.972278 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:49Z","lastTransitionTime":"2026-03-18T14:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.978610 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:49 crc kubenswrapper[4756]: I0318 14:01:49.995502 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:49Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.005431 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.016272 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b
7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.027849 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.046259 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc 
kubenswrapper[4756]: I0318 14:01:50.061293 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.075041 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.075074 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.075083 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 
14:01:50.075096 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.075105 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:50Z","lastTransitionTime":"2026-03-18T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.095138 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.107394 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.121537 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e0
67bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.131626 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: 
I0318 14:01:50.149623 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:46Z\\\",\\\"message\\\":\\\" removal\\\\nI0318 14:01:46.119563 6550 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 14:01:46.119586 6550 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 14:01:46.119597 6550 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:01:46.119603 6550 handler.go:190] Sending *v1.Pod 
event handler 6 for removal\\\\nI0318 14:01:46.119622 6550 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:01:46.119630 6550 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:01:46.119637 6550 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 14:01:46.119645 6550 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:01:46.119661 6550 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 14:01:46.119743 6550 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:01:46.119780 6550 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:01:46.119802 6550 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 14:01:46.119814 6550 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:01:46.119826 6550 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:01:46.119829 6550 factory.go:656] Stopping watch factory\\\\nI0318 14:01:46.119847 6550 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:01:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:47Z\\\",\\\"message\\\":\\\"nsact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 14:01:47.754920 6712 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"ba175bbe-5cc4-47e6-a32d-57693e1320bd\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, 
b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.160690 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.171404 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.177037 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:50 crc 
kubenswrapper[4756]: I0318 14:01:50.177068 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.177080 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.177095 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.177105 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:50Z","lastTransitionTime":"2026-03-18T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.264792 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.264831 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.264840 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.264854 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.264864 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:50Z","lastTransitionTime":"2026-03-18T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:50 crc kubenswrapper[4756]: E0318 14:01:50.281541 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.285559 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.285601 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.285610 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.285624 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.285634 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:50Z","lastTransitionTime":"2026-03-18T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:50 crc kubenswrapper[4756]: E0318 14:01:50.298187 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.301793 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.301834 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.301845 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.301860 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.301870 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:50Z","lastTransitionTime":"2026-03-18T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:50 crc kubenswrapper[4756]: E0318 14:01:50.313424 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.314595 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.314648 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:50 crc kubenswrapper[4756]: E0318 14:01:50.314691 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.314716 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:50 crc kubenswrapper[4756]: E0318 14:01:50.314798 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:50 crc kubenswrapper[4756]: E0318 14:01:50.314916 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.317549 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.317593 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.317608 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.317630 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.317646 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:50Z","lastTransitionTime":"2026-03-18T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.333243 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.333268 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.333278 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.333294 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.333306 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:50Z","lastTransitionTime":"2026-03-18T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:50 crc kubenswrapper[4756]: E0318 14:01:50.351720 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:50Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:50 crc kubenswrapper[4756]: E0318 14:01:50.351894 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.353546 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.353584 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.353597 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.353615 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.353629 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:50Z","lastTransitionTime":"2026-03-18T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.379388 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs\") pod \"network-metrics-daemon-gfdtl\" (UID: \"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\") " pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:01:50 crc kubenswrapper[4756]: E0318 14:01:50.379624 4756 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:01:50 crc kubenswrapper[4756]: E0318 14:01:50.379757 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs podName:c11d2088-741c-4812-8eb2-ccfc3d0c7d11 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:51.379729612 +0000 UTC m=+112.694147667 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs") pod "network-metrics-daemon-gfdtl" (UID: "c11d2088-741c-4812-8eb2-ccfc3d0c7d11") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.456573 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.456648 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.456665 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.456688 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.456706 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:50Z","lastTransitionTime":"2026-03-18T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.558796 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.558856 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.558870 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.558887 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.558898 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:50Z","lastTransitionTime":"2026-03-18T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.662517 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.662573 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.662590 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.662618 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.662676 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:50Z","lastTransitionTime":"2026-03-18T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.766555 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.766620 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.766635 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.766662 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.766679 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:50Z","lastTransitionTime":"2026-03-18T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.869796 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.869882 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.869902 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.870349 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.870465 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:50Z","lastTransitionTime":"2026-03-18T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.973735 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.973793 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.973803 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.973823 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:50 crc kubenswrapper[4756]: I0318 14:01:50.973837 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:50Z","lastTransitionTime":"2026-03-18T14:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.077843 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.077897 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.077911 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.077933 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.077947 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:51Z","lastTransitionTime":"2026-03-18T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.181357 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.181421 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.181435 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.181459 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.181476 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:51Z","lastTransitionTime":"2026-03-18T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.285396 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.285446 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.285457 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.285544 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.285559 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:51Z","lastTransitionTime":"2026-03-18T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.315409 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:01:51 crc kubenswrapper[4756]: E0318 14:01:51.315549 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.388610 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.388658 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.388667 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.388684 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.388694 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:51Z","lastTransitionTime":"2026-03-18T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.391437 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs\") pod \"network-metrics-daemon-gfdtl\" (UID: \"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\") " pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:01:51 crc kubenswrapper[4756]: E0318 14:01:51.391583 4756 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:01:51 crc kubenswrapper[4756]: E0318 14:01:51.391636 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs podName:c11d2088-741c-4812-8eb2-ccfc3d0c7d11 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:53.391623678 +0000 UTC m=+114.706041653 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs") pod "network-metrics-daemon-gfdtl" (UID: "c11d2088-741c-4812-8eb2-ccfc3d0c7d11") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.492141 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.492188 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.492196 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.492216 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.492225 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:51Z","lastTransitionTime":"2026-03-18T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.594726 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.594801 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.594825 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.594884 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.594901 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:51Z","lastTransitionTime":"2026-03-18T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.697780 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.697848 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.697865 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.697892 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.697912 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:51Z","lastTransitionTime":"2026-03-18T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.800493 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.800577 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.800595 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.800617 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.800632 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:51Z","lastTransitionTime":"2026-03-18T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.903187 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.903225 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.903237 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.903253 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:51 crc kubenswrapper[4756]: I0318 14:01:51.903264 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:51Z","lastTransitionTime":"2026-03-18T14:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.006209 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.006245 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.006257 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.006274 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.006285 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:52Z","lastTransitionTime":"2026-03-18T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.109031 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.109163 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.109184 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.109209 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.109228 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:52Z","lastTransitionTime":"2026-03-18T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.211457 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.211533 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.211556 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.211586 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.211609 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:52Z","lastTransitionTime":"2026-03-18T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.314426 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.314456 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.314450 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.314595 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.314644 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:52 crc kubenswrapper[4756]: E0318 14:01:52.314596 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.314669 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.314757 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:52 crc kubenswrapper[4756]: E0318 14:01:52.314753 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.314783 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:52Z","lastTransitionTime":"2026-03-18T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:52 crc kubenswrapper[4756]: E0318 14:01:52.314875 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.417227 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.417316 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.417353 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.417389 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.417412 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:52Z","lastTransitionTime":"2026-03-18T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.521076 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.521149 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.521168 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.521187 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.521199 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:52Z","lastTransitionTime":"2026-03-18T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.624004 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.624078 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.624098 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.624167 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.624198 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:52Z","lastTransitionTime":"2026-03-18T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.727661 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.727713 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.727722 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.727740 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.727752 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:52Z","lastTransitionTime":"2026-03-18T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.830830 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.830888 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.830903 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.830927 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.830941 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:52Z","lastTransitionTime":"2026-03-18T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.934630 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.934698 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.934716 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.934742 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:52 crc kubenswrapper[4756]: I0318 14:01:52.934759 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:52Z","lastTransitionTime":"2026-03-18T14:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.038354 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.038440 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.038462 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.038493 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.038516 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:53Z","lastTransitionTime":"2026-03-18T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.141684 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.141755 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.141773 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.141798 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.141816 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:53Z","lastTransitionTime":"2026-03-18T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.244743 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.244795 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.244807 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.244826 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.244840 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:53Z","lastTransitionTime":"2026-03-18T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.314661 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:01:53 crc kubenswrapper[4756]: E0318 14:01:53.314878 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.355869 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.355977 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.356028 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.356056 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.356108 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:53Z","lastTransitionTime":"2026-03-18T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.416785 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs\") pod \"network-metrics-daemon-gfdtl\" (UID: \"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\") " pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:01:53 crc kubenswrapper[4756]: E0318 14:01:53.416936 4756 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:01:53 crc kubenswrapper[4756]: E0318 14:01:53.416989 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs podName:c11d2088-741c-4812-8eb2-ccfc3d0c7d11 nodeName:}" failed. No retries permitted until 2026-03-18 14:01:57.416975646 +0000 UTC m=+118.731393621 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs") pod "network-metrics-daemon-gfdtl" (UID: "c11d2088-741c-4812-8eb2-ccfc3d0c7d11") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.459490 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.459552 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.459570 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.459592 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.459609 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:53Z","lastTransitionTime":"2026-03-18T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.563322 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.563387 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.563404 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.563432 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.563450 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:53Z","lastTransitionTime":"2026-03-18T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.667053 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.667180 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.667209 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.667236 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.667253 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:53Z","lastTransitionTime":"2026-03-18T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.770584 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.770670 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.770688 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.770710 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.770727 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:53Z","lastTransitionTime":"2026-03-18T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.874490 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.874580 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.874599 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.874624 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.874644 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:53Z","lastTransitionTime":"2026-03-18T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.977410 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.977481 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.977501 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.977524 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:53 crc kubenswrapper[4756]: I0318 14:01:53.977541 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:53Z","lastTransitionTime":"2026-03-18T14:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.080739 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.080788 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.080804 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.080824 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.080841 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:54Z","lastTransitionTime":"2026-03-18T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.183608 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.183657 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.183670 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.183693 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.183706 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:54Z","lastTransitionTime":"2026-03-18T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.285976 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.286033 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.286046 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.286064 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.286076 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:54Z","lastTransitionTime":"2026-03-18T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.314583 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.314628 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.314749 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:54 crc kubenswrapper[4756]: E0318 14:01:54.314746 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:54 crc kubenswrapper[4756]: E0318 14:01:54.314909 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:54 crc kubenswrapper[4756]: E0318 14:01:54.314986 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.390263 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.390325 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.390344 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.390369 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.390386 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:54Z","lastTransitionTime":"2026-03-18T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.493813 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.493851 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.493860 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.493874 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.493884 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:54Z","lastTransitionTime":"2026-03-18T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.595577 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.595646 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.595668 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.595733 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.595755 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:54Z","lastTransitionTime":"2026-03-18T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.699028 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.699072 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.699086 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.699104 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.699143 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:54Z","lastTransitionTime":"2026-03-18T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.801382 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.801454 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.801472 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.801505 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.801528 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:54Z","lastTransitionTime":"2026-03-18T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.904272 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.904358 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.904385 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.904415 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:54 crc kubenswrapper[4756]: I0318 14:01:54.904442 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:54Z","lastTransitionTime":"2026-03-18T14:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.007774 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.007817 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.007828 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.007850 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.007863 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:55Z","lastTransitionTime":"2026-03-18T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.111627 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.111925 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.111957 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.111990 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.112014 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:55Z","lastTransitionTime":"2026-03-18T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.215400 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.215445 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.215455 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.215495 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.215513 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:55Z","lastTransitionTime":"2026-03-18T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.314777 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:01:55 crc kubenswrapper[4756]: E0318 14:01:55.315005 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.318482 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.318553 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.318571 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.318597 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.318621 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:55Z","lastTransitionTime":"2026-03-18T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.422419 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.422477 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.422495 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.422519 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.422537 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:55Z","lastTransitionTime":"2026-03-18T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.525596 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.525685 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.525717 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.525752 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.525779 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:55Z","lastTransitionTime":"2026-03-18T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.629336 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.629385 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.629400 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.629417 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.629429 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:55Z","lastTransitionTime":"2026-03-18T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.732471 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.732538 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.732562 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.732592 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.732619 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:55Z","lastTransitionTime":"2026-03-18T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.836640 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.836744 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.836770 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.836805 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.836833 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:55Z","lastTransitionTime":"2026-03-18T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.940460 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.940506 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.940515 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.940529 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:55 crc kubenswrapper[4756]: I0318 14:01:55.940539 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:55Z","lastTransitionTime":"2026-03-18T14:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.043512 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.043556 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.043567 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.043581 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.043590 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:56Z","lastTransitionTime":"2026-03-18T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.147745 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.147814 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.147836 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.147865 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.147885 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:56Z","lastTransitionTime":"2026-03-18T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.251523 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.251605 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.251627 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.251650 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.251666 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:56Z","lastTransitionTime":"2026-03-18T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.314774 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.314855 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.314781 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:56 crc kubenswrapper[4756]: E0318 14:01:56.315036 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:56 crc kubenswrapper[4756]: E0318 14:01:56.315186 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:56 crc kubenswrapper[4756]: E0318 14:01:56.315326 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.354782 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.354896 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.354930 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.354963 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.354983 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:56Z","lastTransitionTime":"2026-03-18T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.458202 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.458261 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.458281 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.458303 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.458318 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:56Z","lastTransitionTime":"2026-03-18T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.561783 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.561846 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.561869 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.561901 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.561923 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:56Z","lastTransitionTime":"2026-03-18T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.665029 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.665098 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.665145 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.665171 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.665189 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:56Z","lastTransitionTime":"2026-03-18T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.768573 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.768641 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.768661 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.768685 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.768702 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:56Z","lastTransitionTime":"2026-03-18T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.871774 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.871845 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.871868 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.871925 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.871951 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:56Z","lastTransitionTime":"2026-03-18T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.975142 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.975213 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.975233 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.975268 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:56 crc kubenswrapper[4756]: I0318 14:01:56.975308 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:56Z","lastTransitionTime":"2026-03-18T14:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.078992 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.079073 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.079096 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.079166 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.079192 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:57Z","lastTransitionTime":"2026-03-18T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.181905 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.181969 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.181985 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.182011 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.182028 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:57Z","lastTransitionTime":"2026-03-18T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.285611 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.285681 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.285710 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.285741 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.285766 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:57Z","lastTransitionTime":"2026-03-18T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.315298 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:01:57 crc kubenswrapper[4756]: E0318 14:01:57.315524 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.389256 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.389296 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.389304 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.389319 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.389327 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:57Z","lastTransitionTime":"2026-03-18T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.462636 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs\") pod \"network-metrics-daemon-gfdtl\" (UID: \"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\") " pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:01:57 crc kubenswrapper[4756]: E0318 14:01:57.462811 4756 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:01:57 crc kubenswrapper[4756]: E0318 14:01:57.462893 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs podName:c11d2088-741c-4812-8eb2-ccfc3d0c7d11 nodeName:}" failed. No retries permitted until 2026-03-18 14:02:05.462867719 +0000 UTC m=+126.777285704 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs") pod "network-metrics-daemon-gfdtl" (UID: "c11d2088-741c-4812-8eb2-ccfc3d0c7d11") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.493252 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.493295 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.493309 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.493329 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.493344 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:57Z","lastTransitionTime":"2026-03-18T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.596441 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.596527 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.596548 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.596573 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.596592 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:57Z","lastTransitionTime":"2026-03-18T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.700307 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.700380 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.700405 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.700435 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.700461 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:57Z","lastTransitionTime":"2026-03-18T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.804047 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.804113 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.804183 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.804214 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.804236 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:57Z","lastTransitionTime":"2026-03-18T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.907593 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.907675 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.907870 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.907895 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:57 crc kubenswrapper[4756]: I0318 14:01:57.907913 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:57Z","lastTransitionTime":"2026-03-18T14:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.011083 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.011208 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.011226 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.011264 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.011284 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:58Z","lastTransitionTime":"2026-03-18T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.113726 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.114097 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.114145 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.114172 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.114191 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:58Z","lastTransitionTime":"2026-03-18T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.222199 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.222251 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.222268 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.222291 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.222309 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:58Z","lastTransitionTime":"2026-03-18T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.315256 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.315378 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.315683 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:01:58 crc kubenswrapper[4756]: E0318 14:01:58.315831 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:01:58 crc kubenswrapper[4756]: E0318 14:01:58.315951 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:01:58 crc kubenswrapper[4756]: E0318 14:01:58.316063 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.325276 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.325345 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.325366 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.325392 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.325409 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:58Z","lastTransitionTime":"2026-03-18T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.330515 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.428475 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.428554 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.428572 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.428596 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.428616 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:58Z","lastTransitionTime":"2026-03-18T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.532219 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.532282 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.532300 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.532323 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.532404 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:58Z","lastTransitionTime":"2026-03-18T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.635208 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.635274 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.635296 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.635325 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.635348 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:58Z","lastTransitionTime":"2026-03-18T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.738318 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.738393 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.738422 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.738446 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.738462 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:58Z","lastTransitionTime":"2026-03-18T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.841571 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.841647 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.841709 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.841739 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.841761 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:58Z","lastTransitionTime":"2026-03-18T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.944170 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.944215 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.944227 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.944244 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:58 crc kubenswrapper[4756]: I0318 14:01:58.944257 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:58Z","lastTransitionTime":"2026-03-18T14:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.047555 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.047618 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.047635 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.047659 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.047679 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:59Z","lastTransitionTime":"2026-03-18T14:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.150529 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.150571 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.150582 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.150599 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.150611 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:59Z","lastTransitionTime":"2026-03-18T14:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.183299 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.205297 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.227424 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.253531 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:01:59 crc 
kubenswrapper[4756]: I0318 14:01:59.253607 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.253629 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.253659 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.253681 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:01:59Z","lastTransitionTime":"2026-03-18T14:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.254324 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e0
67bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.273479 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: 
I0318 14:01:59.301964 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:46Z\\\",\\\"message\\\":\\\" removal\\\\nI0318 14:01:46.119563 6550 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 14:01:46.119586 6550 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 14:01:46.119597 6550 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:01:46.119603 6550 handler.go:190] Sending *v1.Pod 
event handler 6 for removal\\\\nI0318 14:01:46.119622 6550 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:01:46.119630 6550 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:01:46.119637 6550 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 14:01:46.119645 6550 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:01:46.119661 6550 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 14:01:46.119743 6550 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:01:46.119780 6550 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:01:46.119802 6550 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 14:01:46.119814 6550 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:01:46.119826 6550 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:01:46.119829 6550 factory.go:656] Stopping watch factory\\\\nI0318 14:01:46.119847 6550 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:01:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:47Z\\\",\\\"message\\\":\\\"nsact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 14:01:47.754920 6712 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"ba175bbe-5cc4-47e6-a32d-57693e1320bd\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, 
b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.314429 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:01:59 crc kubenswrapper[4756]: E0318 14:01:59.314639 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.322022 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf3f6086-068f-4d8a-b24e-e8abb85ed346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:00:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 
')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 14:00:01.327493 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 14:00:01.329968 1 observer_polling.go:159] Starting file observer\\\\nI0318 14:00:01.364670 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 14:00:01.367515 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 14:00:27.802989 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 14:00:27.803068 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.339955 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: E0318 14:01:59.354646 4756 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.355702 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.367720 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae86
4bd59e8807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.382368 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f5
2acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.397228 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: E0318 14:01:59.408103 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.411599 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.448235 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.461088 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.473695 4756 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.485201 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01
:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.496002 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.505441 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.519028 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf3f6086-068f-4d8a-b24e-e8abb85ed346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:00:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 14:00:01.327493 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 14:00:01.329968 1 observer_polling.go:159] Starting file observer\\\\nI0318 14:00:01.364670 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 14:00:01.367515 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 14:00:27.802989 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 14:00:27.803068 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.530232 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.540940 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.555896 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167
600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:
01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.567445 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc 
kubenswrapper[4756]: I0318 14:01:59.585672 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac35b98598b4e580e38cb84770bcab25478fd1fd804053a5d123f49e9fbcbcb4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:46Z\\\",\\\"message\\\":\\\" removal\\\\nI0318 14:01:46.119563 6550 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 14:01:46.119586 6550 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 14:01:46.119597 6550 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:01:46.119603 6550 handler.go:190] Sending *v1.Pod 
event handler 6 for removal\\\\nI0318 14:01:46.119622 6550 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:01:46.119630 6550 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:01:46.119637 6550 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 14:01:46.119645 6550 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:01:46.119661 6550 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0318 14:01:46.119743 6550 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:01:46.119780 6550 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:01:46.119802 6550 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 14:01:46.119814 6550 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:01:46.119826 6550 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:01:46.119829 6550 factory.go:656] Stopping watch factory\\\\nI0318 14:01:46.119847 6550 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:01:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:47Z\\\",\\\"message\\\":\\\"nsact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 14:01:47.754920 6712 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"ba175bbe-5cc4-47e6-a32d-57693e1320bd\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, 
b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.599425 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f5
2acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.609895 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.619964 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.628495 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae864bd59e8
807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.640349 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.651992 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.664246 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.684641 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc
1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.699686 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.714888 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.725518 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:01:59 crc kubenswrapper[4756]: I0318 14:01:59.737938 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:01:59Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.314613 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.314618 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.314685 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:00 crc kubenswrapper[4756]: E0318 14:02:00.315384 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.315538 4756 scope.go:117] "RemoveContainer" containerID="37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094" Mar 18 14:02:00 crc kubenswrapper[4756]: E0318 14:02:00.315526 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:00 crc kubenswrapper[4756]: E0318 14:02:00.315674 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.334812 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.354442 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.371919 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a2
2b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.393630 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:47Z\\\",\\\"message\\\":\\\"nsact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 14:01:47.754920 6712 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"ba175bbe-5cc4-47e6-a32d-57693e1320bd\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff4
6af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.406906 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf3f6086-068f-4d8a-b24e-e8abb85ed346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:00:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 14:00:01.327493 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 14:00:01.329968 1 observer_polling.go:159] Starting file observer\\\\nI0318 14:00:01.364670 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 14:00:01.367515 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 14:00:27.802989 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 14:00:27.803068 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.421348 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.432399 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.443752 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae864bd59e8
807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.463938 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f5
2acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.478939 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.489795 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.504496 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.506750 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.506783 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.506792 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.506807 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.506822 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:00Z","lastTransitionTime":"2026-03-18T14:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.521074 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: E0318 14:02:00.528671 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.531166 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.531802 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.531831 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.531842 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.531861 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.531872 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:00Z","lastTransitionTime":"2026-03-18T14:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.541690 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: E0318 14:02:00.545411 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.548270 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.548303 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.548313 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.548329 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.548338 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:00Z","lastTransitionTime":"2026-03-18T14:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.556547 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc 
kubenswrapper[4756]: E0318 14:02:00.562233 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.565183 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.565209 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.565220 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.565233 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.565242 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:00Z","lastTransitionTime":"2026-03-18T14:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.567696 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: E0318 14:02:00.578225 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.581050 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.581082 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.581089 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.581101 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.581109 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:00Z","lastTransitionTime":"2026-03-18T14:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.585752 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: E0318 14:02:00.593532 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: E0318 14:02:00.593646 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.908200 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovnkube-controller/1.log" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.911351 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerStarted","Data":"9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e"} Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.911896 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:02:00 crc kubenswrapper[4756]: 
I0318 14:02:00.925620 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf3f6086-068f-4d8a-b24e-e8abb85ed346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:00:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 14:00:01.327493 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 14:00:01.329968 1 observer_polling.go:159] Starting file observer\\\\nI0318 14:00:01.364670 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 14:00:01.367515 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 14:00:27.802989 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 14:00:27.803068 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.936220 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.950413 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.962826 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167
600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:
01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:00 crc kubenswrapper[4756]: I0318 14:02:00.976944 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc 
kubenswrapper[4756]: I0318 14:02:01.001345 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:47Z\\\",\\\"message\\\":\\\"nsact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 14:01:47.754920 6712 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"ba175bbe-5cc4-47e6-a32d-57693e1320bd\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, 
b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:00Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.016163 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f5
2acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.034946 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.056093 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.072748 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae864bd59e8
807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.105239 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.122698 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.135845 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.161359 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc
1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.176527 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.187092 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.201237 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.209806 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.314875 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:01 crc kubenswrapper[4756]: E0318 14:02:01.315715 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.325784 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.917400 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovnkube-controller/2.log" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.918494 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovnkube-controller/1.log" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.922202 4756 generic.go:334] "Generic (PLEG): container finished" podID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerID="9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e" exitCode=1 Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.922306 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" 
event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerDied","Data":"9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e"} Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.922367 4756 scope.go:117] "RemoveContainer" containerID="37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.923384 4756 scope.go:117] "RemoveContainer" containerID="9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e" Mar 18 14:02:01 crc kubenswrapper[4756]: E0318 14:02:01.923667 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.939018 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.962573 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.979854 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:01 crc kubenswrapper[4756]: I0318 14:02:01.995055 4756 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:01Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.010460 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01
:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.026678 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.044977 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf3f6086-068f-4d8a-b24e-e8abb85ed346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:00:27Z\\\",\\\"message\\\":\\\"+ 
timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 14:00:01.327493 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 14:00:01.329968 1 observer_polling.go:159] Starting file observer\\\\nI0318 14:00:01.364670 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 14:00:01.367515 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 14:00:27.802989 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 14:00:27.803068 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.062093 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.079332 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.094786 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167
600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:
01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.107070 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc 
kubenswrapper[4756]: I0318 14:02:02.129641 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37476ae6fafc7e9d1ded4a8934f7471ce6c15986cb2d2c56c1544058e3ea6094\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:01:47Z\\\",\\\"message\\\":\\\"nsact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 14:01:47.754920 6712 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"ba175bbe-5cc4-47e6-a32d-57693e1320bd\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, b\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:01Z\\\",\\\"message\\\":\\\"actory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132317 6977 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:02:01.132336 6977 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:02:01.132394 6977 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:02:01.132464 6977 handler.go:208] Removed *v1.Node 
event handler 7\\\\nI0318 14:02:01.132631 6977 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0318 14:02:01.132684 6977 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:02:01.132719 6977 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 14:02:01.132734 6977 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 14:02:01.132750 6977 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 14:02:01.132795 6977 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:02:01.132834 6977 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:02:01.132803 6977 factory.go:656] Stopping watch factory\\\\nI0318 14:02:01.132873 6977 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:02:01.132831 6977 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132923 6977 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 14:02:01.133034 6977 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:02:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a
86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.148547 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f5
2acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.168942 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.189077 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.204935 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae864bd59e8
807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.222229 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b96e094-221e-4549-bdcb-010e5d2e1346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a7ae4a678ad496f072b4a15ec21271439efe888a3dcaeab650918b804f3df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9ab2bc4932c031750af26b1bee4bba55196c4a3f57252597753681ae34d37a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92f5c994aeb269fd08691355c96c16c3f714e3dca2d9b8329015e9d32d4e1ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.241825 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.255832 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.315066 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.315105 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.315108 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:02 crc kubenswrapper[4756]: E0318 14:02:02.315348 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:02 crc kubenswrapper[4756]: E0318 14:02:02.315458 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:02 crc kubenswrapper[4756]: E0318 14:02:02.315580 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.927211 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovnkube-controller/2.log" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.930898 4756 scope.go:117] "RemoveContainer" containerID="9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e" Mar 18 14:02:02 crc kubenswrapper[4756]: E0318 14:02:02.931093 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.944639 
4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b96e094-221e-4549-bdcb-010e5d2e1346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a7ae4a678ad496f072b4a15ec21271439efe888a3dcaeab650918b804f3df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9ab2bc4932c031750af26b1bee4bba55196c4a3f57252597753681ae34d37a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92f5c994aeb269fd08691355c96c16c3f714e3dca2d9b8329015e9d32d4e1ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.963793 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.976663 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:02 crc kubenswrapper[4756]: I0318 14:02:02.990184 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:02Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc kubenswrapper[4756]: I0318 14:02:03.015798 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc
1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:03Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc kubenswrapper[4756]: I0318 14:02:03.036400 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:03Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc kubenswrapper[4756]: I0318 14:02:03.055162 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:02:03Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc kubenswrapper[4756]: I0318 14:02:03.070327 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:03Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc kubenswrapper[4756]: I0318 14:02:03.079760 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:03Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc kubenswrapper[4756]: I0318 14:02:03.091724 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf3f6086-068f-4d8a-b24e-e8abb85ed346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:00:27Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 14:00:01.327493 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 14:00:01.329968 1 observer_polling.go:159] Starting file observer\\\\nI0318 14:00:01.364670 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 14:00:01.367515 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 14:00:27.802989 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 14:00:27.803068 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:03Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc kubenswrapper[4756]: I0318 14:02:03.105183 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:03Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc kubenswrapper[4756]: I0318 14:02:03.117719 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:03Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc kubenswrapper[4756]: I0318 14:02:03.132301 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167
600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:
01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:03Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc kubenswrapper[4756]: I0318 14:02:03.144428 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:03Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc 
kubenswrapper[4756]: I0318 14:02:03.167041 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:01Z\\\",\\\"message\\\":\\\"actory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132317 6977 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:02:01.132336 6977 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:02:01.132394 6977 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:02:01.132464 6977 
handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:02:01.132631 6977 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0318 14:02:01.132684 6977 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:02:01.132719 6977 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 14:02:01.132734 6977 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 14:02:01.132750 6977 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 14:02:01.132795 6977 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:02:01.132834 6977 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:02:01.132803 6977 factory.go:656] Stopping watch factory\\\\nI0318 14:02:01.132873 6977 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:02:01.132831 6977 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132923 6977 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 14:02:01.133034 6977 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:02:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff4
6af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:03Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc kubenswrapper[4756]: I0318 14:02:03.182674 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f5
2acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:03Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc kubenswrapper[4756]: I0318 14:02:03.196436 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:03Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc kubenswrapper[4756]: I0318 14:02:03.211509 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:03Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc kubenswrapper[4756]: I0318 14:02:03.227520 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae864bd59e8
807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:03Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:03 crc kubenswrapper[4756]: I0318 14:02:03.315404 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:03 crc kubenswrapper[4756]: E0318 14:02:03.315608 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:04 crc kubenswrapper[4756]: I0318 14:02:04.315371 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:04 crc kubenswrapper[4756]: I0318 14:02:04.315396 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:04 crc kubenswrapper[4756]: I0318 14:02:04.315576 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:04 crc kubenswrapper[4756]: E0318 14:02:04.315736 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:04 crc kubenswrapper[4756]: E0318 14:02:04.315908 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:04 crc kubenswrapper[4756]: E0318 14:02:04.316239 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:04 crc kubenswrapper[4756]: E0318 14:02:04.409472 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:02:05 crc kubenswrapper[4756]: I0318 14:02:05.314779 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:05 crc kubenswrapper[4756]: E0318 14:02:05.315032 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:05 crc kubenswrapper[4756]: I0318 14:02:05.554932 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs\") pod \"network-metrics-daemon-gfdtl\" (UID: \"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\") " pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:05 crc kubenswrapper[4756]: E0318 14:02:05.555055 4756 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:02:05 crc kubenswrapper[4756]: E0318 14:02:05.555140 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs podName:c11d2088-741c-4812-8eb2-ccfc3d0c7d11 nodeName:}" failed. No retries permitted until 2026-03-18 14:02:21.555101446 +0000 UTC m=+142.869519421 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs") pod "network-metrics-daemon-gfdtl" (UID: "c11d2088-741c-4812-8eb2-ccfc3d0c7d11") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:02:06 crc kubenswrapper[4756]: I0318 14:02:06.314670 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:06 crc kubenswrapper[4756]: I0318 14:02:06.314742 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:06 crc kubenswrapper[4756]: E0318 14:02:06.314823 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:06 crc kubenswrapper[4756]: I0318 14:02:06.314858 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:06 crc kubenswrapper[4756]: E0318 14:02:06.315037 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:06 crc kubenswrapper[4756]: E0318 14:02:06.315322 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:07 crc kubenswrapper[4756]: I0318 14:02:07.314709 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:07 crc kubenswrapper[4756]: E0318 14:02:07.314959 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:08 crc kubenswrapper[4756]: I0318 14:02:08.314569 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:08 crc kubenswrapper[4756]: I0318 14:02:08.314615 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:08 crc kubenswrapper[4756]: E0318 14:02:08.314758 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:08 crc kubenswrapper[4756]: I0318 14:02:08.314800 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:08 crc kubenswrapper[4756]: E0318 14:02:08.315063 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:08 crc kubenswrapper[4756]: E0318 14:02:08.315251 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.314893 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:09 crc kubenswrapper[4756]: E0318 14:02:09.315372 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.339862 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 
14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.359430 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.381000 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.399693 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae864bd59e8
807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: E0318 14:02:09.410285 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.423682 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b96e094-221e-4549-bdcb-010e5d2e1346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a7ae4a678ad496f072b4a15ec21271439efe888a3dcaeab650918b804f3df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9ab2bc4932c031750af26b1bee4bba55196c4a3f57252597753681ae34d37a\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92f5c994aeb269fd08691355c96c16c3f714e3dca2d9b8329015e9d32d4e1ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.443267 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.458455 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.473249 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.509319 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc
1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.536791 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.571519 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.594328 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.609274 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.625840 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf3f6086-068f-4d8a-b24e-e8abb85ed346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:00:27Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 14:00:01.327493 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 14:00:01.329968 1 observer_polling.go:159] Starting file observer\\\\nI0318 14:00:01.364670 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 14:00:01.367515 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 14:00:27.802989 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 14:00:27.803068 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.642266 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.659373 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.676355 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167
600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:
01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc kubenswrapper[4756]: I0318 14:02:09.693782 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:09 crc 
kubenswrapper[4756]: I0318 14:02:09.713441 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:01Z\\\",\\\"message\\\":\\\"actory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132317 6977 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:02:01.132336 6977 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:02:01.132394 6977 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:02:01.132464 6977 
handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:02:01.132631 6977 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0318 14:02:01.132684 6977 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:02:01.132719 6977 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 14:02:01.132734 6977 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 14:02:01.132750 6977 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 14:02:01.132795 6977 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:02:01.132834 6977 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:02:01.132803 6977 factory.go:656] Stopping watch factory\\\\nI0318 14:02:01.132873 6977 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:02:01.132831 6977 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132923 6977 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 14:02:01.133034 6977 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:02:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff4
6af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:09Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.314817 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.314878 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:10 crc kubenswrapper[4756]: E0318 14:02:10.315063 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.315456 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:10 crc kubenswrapper[4756]: E0318 14:02:10.315516 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:10 crc kubenswrapper[4756]: E0318 14:02:10.315582 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.874386 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.874432 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.874444 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.874460 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.874471 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:10Z","lastTransitionTime":"2026-03-18T14:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:10 crc kubenswrapper[4756]: E0318 14:02:10.889466 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:10Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.893571 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.893609 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.893645 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.893664 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.893677 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:10Z","lastTransitionTime":"2026-03-18T14:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:10 crc kubenswrapper[4756]: E0318 14:02:10.908104 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:10Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.911422 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.911470 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.911481 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.911497 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.911510 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:10Z","lastTransitionTime":"2026-03-18T14:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:10 crc kubenswrapper[4756]: E0318 14:02:10.923776 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:10Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.927281 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.927307 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.927318 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.927333 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.927345 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:10Z","lastTransitionTime":"2026-03-18T14:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:10 crc kubenswrapper[4756]: E0318 14:02:10.937853 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:10Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.941941 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.941989 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.942003 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.942022 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:10 crc kubenswrapper[4756]: I0318 14:02:10.942034 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:10Z","lastTransitionTime":"2026-03-18T14:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:10 crc kubenswrapper[4756]: E0318 14:02:10.954660 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:10Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:10 crc kubenswrapper[4756]: E0318 14:02:10.954809 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 14:02:11 crc kubenswrapper[4756]: I0318 14:02:11.315311 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:11 crc kubenswrapper[4756]: E0318 14:02:11.315616 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:12 crc kubenswrapper[4756]: I0318 14:02:12.315344 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:12 crc kubenswrapper[4756]: I0318 14:02:12.315394 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:12 crc kubenswrapper[4756]: E0318 14:02:12.315472 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:12 crc kubenswrapper[4756]: E0318 14:02:12.315559 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:12 crc kubenswrapper[4756]: I0318 14:02:12.316223 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:12 crc kubenswrapper[4756]: E0318 14:02:12.316443 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:13 crc kubenswrapper[4756]: I0318 14:02:13.314507 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:13 crc kubenswrapper[4756]: E0318 14:02:13.315028 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:13 crc kubenswrapper[4756]: I0318 14:02:13.315354 4756 scope.go:117] "RemoveContainer" containerID="9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e" Mar 18 14:02:13 crc kubenswrapper[4756]: E0318 14:02:13.315676 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" Mar 18 14:02:14 crc kubenswrapper[4756]: I0318 14:02:14.258613 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.258940 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:18.258904572 +0000 UTC m=+199.573322587 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:02:14 crc kubenswrapper[4756]: I0318 14:02:14.259259 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:14 crc kubenswrapper[4756]: I0318 14:02:14.259364 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.259421 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.259524 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.259543 4756 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:03:18.259511458 +0000 UTC m=+199.573929473 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.259606 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:03:18.2595793 +0000 UTC m=+199.573997325 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:02:14 crc kubenswrapper[4756]: I0318 14:02:14.314959 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:14 crc kubenswrapper[4756]: I0318 14:02:14.315073 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.315141 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:14 crc kubenswrapper[4756]: I0318 14:02:14.314974 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.315269 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.315348 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:14 crc kubenswrapper[4756]: I0318 14:02:14.360755 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:14 crc kubenswrapper[4756]: I0318 14:02:14.360817 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.360967 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.360985 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.360998 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.361026 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.361057 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 14:03:18.361040445 +0000 UTC m=+199.675458430 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.361063 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.361083 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.361179 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 14:03:18.361152468 +0000 UTC m=+199.675570483 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:02:14 crc kubenswrapper[4756]: E0318 14:02:14.411713 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:02:15 crc kubenswrapper[4756]: I0318 14:02:15.315463 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:15 crc kubenswrapper[4756]: E0318 14:02:15.315709 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:16 crc kubenswrapper[4756]: I0318 14:02:16.314768 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:16 crc kubenswrapper[4756]: I0318 14:02:16.314855 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:16 crc kubenswrapper[4756]: E0318 14:02:16.314885 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:16 crc kubenswrapper[4756]: I0318 14:02:16.314951 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:16 crc kubenswrapper[4756]: E0318 14:02:16.315039 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:16 crc kubenswrapper[4756]: E0318 14:02:16.315185 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:17 crc kubenswrapper[4756]: I0318 14:02:17.314574 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:17 crc kubenswrapper[4756]: E0318 14:02:17.314819 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:18 crc kubenswrapper[4756]: I0318 14:02:18.314799 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:18 crc kubenswrapper[4756]: I0318 14:02:18.314884 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:18 crc kubenswrapper[4756]: I0318 14:02:18.314820 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:18 crc kubenswrapper[4756]: E0318 14:02:18.315026 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:18 crc kubenswrapper[4756]: E0318 14:02:18.315250 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:18 crc kubenswrapper[4756]: E0318 14:02:18.315310 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.315386 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:19 crc kubenswrapper[4756]: E0318 14:02:19.315627 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.338141 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf3f6086-068f-4d8a-b24e-e8abb85ed346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:00:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 
')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 14:00:01.327493 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 14:00:01.329968 1 observer_polling.go:159] Starting file observer\\\\nI0318 14:00:01.364670 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 14:00:01.367515 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 14:00:27.802989 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 14:00:27.803068 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.356315 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.377686 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.400498 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167
600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:
01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: E0318 14:02:19.412659 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.422934 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.447383 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:01Z\\\",\\\"message\\\":\\\"actory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132317 6977 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:02:01.132336 6977 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:02:01.132394 6977 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:02:01.132464 6977 
handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:02:01.132631 6977 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0318 14:02:01.132684 6977 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:02:01.132719 6977 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 14:02:01.132734 6977 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 14:02:01.132750 6977 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 14:02:01.132795 6977 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:02:01.132834 6977 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:02:01.132803 6977 factory.go:656] Stopping watch factory\\\\nI0318 14:02:01.132873 6977 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:02:01.132831 6977 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132923 6977 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 14:02:01.133034 6977 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:02:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff4
6af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.465522 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f5
2acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.484320 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.500035 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.516165 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae864bd59e8
807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.535441 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b96e094-221e-4549-bdcb-010e5d2e1346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a7ae4a678ad496f072b4a15ec21271439efe888a3dcaeab650918b804f3df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9ab2bc4932c031750af26b1bee4bba55196c4a3f57252597753681ae34d37a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92f5c994aeb269fd08691355c96c16c3f714e3dca2d9b8329015e9d32d4e1ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.554090 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.566851 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.579557 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.604306 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc
1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.619485 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.636434 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.648745 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:19 crc kubenswrapper[4756]: I0318 14:02:19.660692 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:19Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:20 crc kubenswrapper[4756]: I0318 14:02:20.315007 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:20 crc kubenswrapper[4756]: I0318 14:02:20.315048 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:20 crc kubenswrapper[4756]: I0318 14:02:20.315067 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:20 crc kubenswrapper[4756]: E0318 14:02:20.315294 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:20 crc kubenswrapper[4756]: E0318 14:02:20.315418 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:20 crc kubenswrapper[4756]: E0318 14:02:20.315559 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:20 crc kubenswrapper[4756]: I0318 14:02:20.987679 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:20 crc kubenswrapper[4756]: I0318 14:02:20.987787 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:20 crc kubenswrapper[4756]: I0318 14:02:20.987802 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:20 crc kubenswrapper[4756]: I0318 14:02:20.987825 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:20 crc kubenswrapper[4756]: I0318 14:02:20.987838 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:20Z","lastTransitionTime":"2026-03-18T14:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:21 crc kubenswrapper[4756]: E0318 14:02:21.008767 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:21Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.014456 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.014527 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.014545 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.014570 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.014592 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:21Z","lastTransitionTime":"2026-03-18T14:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:21 crc kubenswrapper[4756]: E0318 14:02:21.039177 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:21Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.044707 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.044742 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.044754 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.044772 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.044785 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:21Z","lastTransitionTime":"2026-03-18T14:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:21 crc kubenswrapper[4756]: E0318 14:02:21.060620 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:21Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.065109 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.065187 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.065204 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.065227 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.065245 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:21Z","lastTransitionTime":"2026-03-18T14:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:21 crc kubenswrapper[4756]: E0318 14:02:21.082861 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:21Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.087462 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.087536 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.087565 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.087592 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.087615 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:21Z","lastTransitionTime":"2026-03-18T14:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:21 crc kubenswrapper[4756]: E0318 14:02:21.107193 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:21Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:21 crc kubenswrapper[4756]: E0318 14:02:21.107512 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.315477 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:21 crc kubenswrapper[4756]: E0318 14:02:21.315691 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:21 crc kubenswrapper[4756]: I0318 14:02:21.639422 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs\") pod \"network-metrics-daemon-gfdtl\" (UID: \"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\") " pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:21 crc kubenswrapper[4756]: E0318 14:02:21.639589 4756 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:02:21 crc kubenswrapper[4756]: E0318 14:02:21.639657 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs podName:c11d2088-741c-4812-8eb2-ccfc3d0c7d11 nodeName:}" failed. No retries permitted until 2026-03-18 14:02:53.639641031 +0000 UTC m=+174.954059026 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs") pod "network-metrics-daemon-gfdtl" (UID: "c11d2088-741c-4812-8eb2-ccfc3d0c7d11") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:02:22 crc kubenswrapper[4756]: I0318 14:02:22.314621 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:22 crc kubenswrapper[4756]: I0318 14:02:22.314653 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:22 crc kubenswrapper[4756]: I0318 14:02:22.314636 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:22 crc kubenswrapper[4756]: E0318 14:02:22.314759 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:22 crc kubenswrapper[4756]: E0318 14:02:22.314814 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:22 crc kubenswrapper[4756]: E0318 14:02:22.314995 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:23 crc kubenswrapper[4756]: I0318 14:02:23.314508 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:23 crc kubenswrapper[4756]: E0318 14:02:23.314707 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.006822 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wz5hm_13703604-4b4e-4eb2-b311-88457b667918/kube-multus/0.log" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.006900 4756 generic.go:334] "Generic (PLEG): container finished" podID="13703604-4b4e-4eb2-b311-88457b667918" containerID="2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3" exitCode=1 Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.006952 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wz5hm" event={"ID":"13703604-4b4e-4eb2-b311-88457b667918","Type":"ContainerDied","Data":"2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3"} Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.007640 4756 scope.go:117] "RemoveContainer" containerID="2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.027280 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.049077 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.065357 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae864bd59e8
807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.083825 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f5
2acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.101992 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.115015 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.130833 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b96e094-221e-4549-bdcb-010e5d2e1346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a7ae4a678ad496f072b4a15ec21271439efe888a3dcaeab650918b804f3df4\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9ab2bc4932c031750af26b1bee4bba55196c4a3f57252597753681ae34d37a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92f5c994aeb269fd08691355c96c16c3f714e3dca2d9b8329015e9d32d4e1ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.169995 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.186936 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.203729 4756 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.217850 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01
:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.232041 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.251027 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.270153 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.288899 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:23Z\\\",\\\"message\\\":\\\"2026-03-18T14:01:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_809a24e4-997b-4eb5-b078-38833d262484\\\\n2026-03-18T14:01:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_809a24e4-997b-4eb5-b078-38833d262484 to /host/opt/cni/bin/\\\\n2026-03-18T14:01:38Z [verbose] multus-daemon started\\\\n2026-03-18T14:01:38Z [verbose] Readiness Indicator file check\\\\n2026-03-18T14:02:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.312264 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.315382 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.315451 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:24 crc kubenswrapper[4756]: E0318 14:02:24.315596 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:24 crc kubenswrapper[4756]: E0318 14:02:24.315809 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.317941 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:24 crc kubenswrapper[4756]: E0318 14:02:24.318271 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.332437 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.357748 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:01Z\\\",\\\"message\\\":\\\"actory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132317 6977 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:02:01.132336 6977 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:02:01.132394 6977 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:02:01.132464 6977 
handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:02:01.132631 6977 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0318 14:02:01.132684 6977 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:02:01.132719 6977 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 14:02:01.132734 6977 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 14:02:01.132750 6977 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 14:02:01.132795 6977 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:02:01.132834 6977 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:02:01.132803 6977 factory.go:656] Stopping watch factory\\\\nI0318 14:02:01.132873 6977 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:02:01.132831 6977 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132923 6977 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 14:02:01.133034 6977 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:02:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff4
6af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: I0318 14:02:24.375626 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf3f6086-068f-4d8a-b24e-e8abb85ed346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:00:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 14:00:01.327493 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 14:00:01.329968 1 observer_polling.go:159] Starting file observer\\\\nI0318 14:00:01.364670 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 14:00:01.367515 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 14:00:27.802989 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 14:00:27.803068 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:24Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:24 crc kubenswrapper[4756]: E0318 14:02:24.414797 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.013019 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wz5hm_13703604-4b4e-4eb2-b311-88457b667918/kube-multus/0.log" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.013111 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wz5hm" event={"ID":"13703604-4b4e-4eb2-b311-88457b667918","Type":"ContainerStarted","Data":"a126ca0e73bc313c3b86c281a1627b238bb55f129d6ef0226b530e34d2b9d629"} Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.032470 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.055923 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.071115 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae864bd59e8
807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.089970 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f5
2acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.105409 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.114709 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.126182 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b96e094-221e-4549-bdcb-010e5d2e1346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a7ae4a678ad496f072b4a15ec21271439efe888a3dcaeab650918b804f3df4\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9ab2bc4932c031750af26b1bee4bba55196c4a3f57252597753681ae34d37a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92f5c994aeb269fd08691355c96c16c3f714e3dca2d9b8329015e9d32d4e1ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.144497 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.160426 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.176863 4756 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.189565 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01
:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.201587 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.214777 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.229115 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.241923 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a126ca0e73bc313c3b86c281a1627b238bb55f129d6ef0226b530e34d2b9d629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:23Z\\\",\\\"message\\\":\\\"2026-03-18T14:01:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_809a24e4-997b-4eb5-b078-38833d262484\\\\n2026-03-18T14:01:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_809a24e4-997b-4eb5-b078-38833d262484 to /host/opt/cni/bin/\\\\n2026-03-18T14:01:38Z [verbose] multus-daemon started\\\\n2026-03-18T14:01:38Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T14:02:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:02:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.257378 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9
f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.267623 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a2
2b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.284452 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:01Z\\\",\\\"message\\\":\\\"actory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132317 6977 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:02:01.132336 6977 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:02:01.132394 6977 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:02:01.132464 6977 
handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:02:01.132631 6977 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0318 14:02:01.132684 6977 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:02:01.132719 6977 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 14:02:01.132734 6977 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 14:02:01.132750 6977 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 14:02:01.132795 6977 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:02:01.132834 6977 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:02:01.132803 6977 factory.go:656] Stopping watch factory\\\\nI0318 14:02:01.132873 6977 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:02:01.132831 6977 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132923 6977 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 14:02:01.133034 6977 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:02:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff4
6af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.295668 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf3f6086-068f-4d8a-b24e-e8abb85ed346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:00:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 14:00:01.327493 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 14:00:01.329968 1 observer_polling.go:159] Starting file observer\\\\nI0318 14:00:01.364670 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 14:00:01.367515 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 14:00:27.802989 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 14:00:27.803068 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:25Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:25 crc kubenswrapper[4756]: I0318 14:02:25.315059 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:25 crc kubenswrapper[4756]: E0318 14:02:25.315192 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:26 crc kubenswrapper[4756]: I0318 14:02:26.315566 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:26 crc kubenswrapper[4756]: I0318 14:02:26.315628 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:26 crc kubenswrapper[4756]: I0318 14:02:26.315740 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:26 crc kubenswrapper[4756]: E0318 14:02:26.315787 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:26 crc kubenswrapper[4756]: E0318 14:02:26.315985 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:26 crc kubenswrapper[4756]: E0318 14:02:26.316722 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:26 crc kubenswrapper[4756]: I0318 14:02:26.317231 4756 scope.go:117] "RemoveContainer" containerID="9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.021700 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovnkube-controller/2.log" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.024041 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerStarted","Data":"3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d"} Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.024427 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.044908 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e0
67bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.060516 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: 
I0318 14:02:27.082907 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:01Z\\\",\\\"message\\\":\\\"actory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132317 6977 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:02:01.132336 6977 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:02:01.132394 6977 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:02:01.132464 6977 
handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:02:01.132631 6977 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0318 14:02:01.132684 6977 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:02:01.132719 6977 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 14:02:01.132734 6977 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 14:02:01.132750 6977 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 14:02:01.132795 6977 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:02:01.132834 6977 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:02:01.132803 6977 factory.go:656] Stopping watch factory\\\\nI0318 14:02:01.132873 6977 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:02:01.132831 6977 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132923 6977 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 14:02:01.133034 6977 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:02:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.095915 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf3f6086-068f-4d8a-b24e-e8abb85ed346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:00:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 14:00:01.327493 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 14:00:01.329968 1 observer_polling.go:159] Starting file observer\\\\nI0318 14:00:01.364670 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 14:00:01.367515 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 14:00:27.802989 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 14:00:27.803068 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.111858 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.126452 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a126ca0e73bc313c3b86c281a1627b238bb55f129d6ef0226b530e34d2b9d629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:23Z\\\",\\\"message\\\":\\\"2026-03-18T14:01:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_809a24e4-997b-4eb5-b078-38833d262484\\\\n2026-03-18T14:01:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_809a24e4-997b-4eb5-b078-38833d262484 to /host/opt/cni/bin/\\\\n2026-03-18T14:01:38Z [verbose] multus-daemon started\\\\n2026-03-18T14:01:38Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T14:02:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:02:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.139149 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae864bd59e8
807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.156736 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f5
2acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.170390 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.184508 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.198404 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b96e094-221e-4549-bdcb-010e5d2e1346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a7ae4a678ad496f072b4a15ec21271439efe888a3dcaeab650918b804f3df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9ab2bc4932c031750af26b1bee4bba55196c4a3f57252597753681ae34d37a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92f5c994aeb269fd08691355c96c16c3f714e3dca2d9b8329015e9d32d4e1ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.211695 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.223148 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.234995 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b
7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.245611 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.255384 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc 
kubenswrapper[4756]: I0318 14:02:27.266080 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.288915 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.303172 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:27Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:27 crc kubenswrapper[4756]: I0318 14:02:27.314615 4756 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:27 crc kubenswrapper[4756]: E0318 14:02:27.314803 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.030166 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovnkube-controller/3.log" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.031162 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovnkube-controller/2.log" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.035074 4756 generic.go:334] "Generic (PLEG): container finished" podID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerID="3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d" exitCode=1 Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.035160 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerDied","Data":"3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d"} Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.035222 4756 scope.go:117] "RemoveContainer" containerID="9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.036318 4756 scope.go:117] "RemoveContainer" containerID="3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d" Mar 18 14:02:28 crc 
kubenswrapper[4756]: E0318 14:02:28.036658 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.058389 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf3f6086-068f-4d8a-b24e-e8abb85ed346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f32a1
883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:00:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 14:00:01.327493 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 14:00:01.329968 1 observer_polling.go:159] Starting file observer\\\\nI0318 14:00:01.364670 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 14:00:01.367515 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 14:00:27.802989 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 14:00:27.803068 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.081972 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.103013 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a126ca0e73bc313c3b86c281a1627b238bb55f129d6ef0226b530e34d2b9d629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:23Z\\\",\\\"message\\\":\\\"2026-03-18T14:01:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_809a24e4-997b-4eb5-b078-38833d262484\\\\n2026-03-18T14:01:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_809a24e4-997b-4eb5-b078-38833d262484 to /host/opt/cni/bin/\\\\n2026-03-18T14:01:38Z [verbose] multus-daemon started\\\\n2026-03-18T14:01:38Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T14:02:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:02:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.128259 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9
f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.145208 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a2
2b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.174476 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f59b10933be126546e3f3bbfacffe5c72f72bb7da24bea02a28484321848a7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:01Z\\\",\\\"message\\\":\\\"actory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132317 6977 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:02:01.132336 6977 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:02:01.132394 6977 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:02:01.132464 6977 
handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:02:01.132631 6977 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0318 14:02:01.132684 6977 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 14:02:01.132719 6977 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 14:02:01.132734 6977 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 14:02:01.132750 6977 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 14:02:01.132795 6977 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 14:02:01.132834 6977 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 14:02:01.132803 6977 factory.go:656] Stopping watch factory\\\\nI0318 14:02:01.132873 6977 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:02:01.132831 6977 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 14:02:01.132923 6977 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 14:02:01.133034 6977 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:02:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:27Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI0318 14:02:27.190276 7243 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0318 14:02:27.190302 7243 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0318 14:02:27.190339 7243 factory.go:1336] Added *v1.Node event handler 7\\\\nI0318 14:02:27.190370 7243 factory.go:1336] Added 
*v1.EgressIP event handler 8\\\\nI0318 14:02:27.190612 7243 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0318 14:02:27.190687 7243 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:02:27.190705 7243 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 14:02:27.190713 7243 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:02:27.190737 7243 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:02:27.190752 7243 factory.go:656] Stopping watch factory\\\\nI0318 14:02:27.190769 7243 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:02:27.190809 7243 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:02:27.190831 7243 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 14:02:27.190944 7243 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\
"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.193792 4756 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169f
c097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.206424 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.221612 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.234560 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae864bd59e8
807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.248454 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b96e094-221e-4549-bdcb-010e5d2e1346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a7ae4a678ad496f072b4a15ec21271439efe888a3dcaeab650918b804f3df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9ab2bc4932c031750af26b1bee4bba55196c4a3f57252597753681ae34d37a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92f5c994aeb269fd08691355c96c16c3f714e3dca2d9b8329015e9d32d4e1ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.262509 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.279069 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.288234 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.305053 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc
1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.314717 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.314717 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.314811 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:28 crc kubenswrapper[4756]: E0318 14:02:28.314902 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:28 crc kubenswrapper[4756]: E0318 14:02:28.314971 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:28 crc kubenswrapper[4756]: E0318 14:02:28.315183 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.317719 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.330703 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.342591 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:28 crc kubenswrapper[4756]: I0318 14:02:28.356548 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:28Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.042734 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovnkube-controller/3.log" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.049194 4756 scope.go:117] "RemoveContainer" containerID="3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d" Mar 18 14:02:29 crc kubenswrapper[4756]: E0318 14:02:29.049553 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.066434 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.086409 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b96e094-221e-4549-bdcb-010e5d2e1346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a7ae4a678ad496f072b4a15ec21271439efe888a3dcaeab650918b804f3df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9ab2bc4932c031750af26b1bee4bba55196c4a3f57252597753681ae34d37a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92f5c994aeb269fd08691355c96c16c3f714e3dca2d9b8329015e9d32d4e1ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.109751 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.127904 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.148811 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.164854 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.178061 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.193562 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.229035 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.245055 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a126ca0e73bc313c3b86c281a1627b238bb55f129d6ef0226b530e34d2b9d629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:23Z\\\",\\\"message\\\":\\\"2026-03-18T14:01:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_809a24e4-997b-4eb5-b078-38833d262484\\\\n2026-03-18T14:01:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_809a24e4-997b-4eb5-b078-38833d262484 to /host/opt/cni/bin/\\\\n2026-03-18T14:01:38Z [verbose] multus-daemon started\\\\n2026-03-18T14:01:38Z [verbose] Readiness Indicator file check\\\\n2026-03-18T14:02:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:02:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.268719 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.286876 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a2
2b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.314659 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:29 crc kubenswrapper[4756]: E0318 14:02:29.314846 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.324459 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:27Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI0318 14:02:27.190276 7243 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0318 14:02:27.190302 7243 address_set.go:302] 
New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0318 14:02:27.190339 7243 factory.go:1336] Added *v1.Node event handler 7\\\\nI0318 14:02:27.190370 7243 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0318 14:02:27.190612 7243 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0318 14:02:27.190687 7243 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:02:27.190705 7243 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 14:02:27.190713 7243 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:02:27.190737 7243 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:02:27.190752 7243 factory.go:656] Stopping watch factory\\\\nI0318 14:02:27.190769 7243 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:02:27.190809 7243 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:02:27.190831 7243 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 14:02:27.190944 7243 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:02:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff4
6af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.344924 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf3f6086-068f-4d8a-b24e-e8abb85ed346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:00:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 14:00:01.327493 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 14:00:01.329968 1 observer_polling.go:159] Starting file observer\\\\nI0318 14:00:01.364670 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 14:00:01.367515 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 14:00:27.802989 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 14:00:27.803068 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.366982 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.382466 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.404360 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae864bd59e8
807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: E0318 14:02:29.415758 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.426681 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' 
detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.443569 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.459923 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.481323 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.498689 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.516471 4756 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.530830 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01
:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.548475 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.569267 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf3f6086-068f-4d8a-b24e-e8abb85ed346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:00:27Z\\\",\\\"message\\\":\\\"+ 
timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 14:00:01.327493 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 14:00:01.329968 1 observer_polling.go:159] Starting file observer\\\\nI0318 14:00:01.364670 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 14:00:01.367515 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 14:00:27.802989 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 14:00:27.803068 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.588335 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.607752 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a126ca0e73bc313c3b86c281a1627b238bb55f129d6ef0226b530e34d2b9d629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:23Z\\\",\\\"message\\\":\\\"2026-03-18T14:01:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_809a24e4-997b-4eb5-b078-38833d262484\\\\n2026-03-18T14:01:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_809a24e4-997b-4eb5-b078-38833d262484 to /host/opt/cni/bin/\\\\n2026-03-18T14:01:38Z [verbose] multus-daemon started\\\\n2026-03-18T14:01:38Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T14:02:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:02:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.624738 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9
f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.638399 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a2
2b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.662147 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:27Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI0318 14:02:27.190276 7243 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0318 14:02:27.190302 7243 address_set.go:302] 
New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0318 14:02:27.190339 7243 factory.go:1336] Added *v1.Node event handler 7\\\\nI0318 14:02:27.190370 7243 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0318 14:02:27.190612 7243 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0318 14:02:27.190687 7243 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:02:27.190705 7243 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 14:02:27.190713 7243 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:02:27.190737 7243 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:02:27.190752 7243 factory.go:656] Stopping watch factory\\\\nI0318 14:02:27.190769 7243 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:02:27.190809 7243 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:02:27.190831 7243 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 14:02:27.190944 7243 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:02:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff4
6af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.681443 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f5
2acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.700333 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.722994 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.738040 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae864bd59e8
807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.754606 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b96e094-221e-4549-bdcb-010e5d2e1346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a7ae4a678ad496f072b4a15ec21271439efe888a3dcaeab650918b804f3df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9ab2bc4932c031750af26b1bee4bba55196c4a3f57252597753681ae34d37a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92f5c994aeb269fd08691355c96c16c3f714e3dca2d9b8329015e9d32d4e1ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.769811 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:29 crc kubenswrapper[4756]: I0318 14:02:29.779992 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:29Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:30 crc kubenswrapper[4756]: I0318 14:02:30.314941 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:30 crc kubenswrapper[4756]: I0318 14:02:30.315013 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:30 crc kubenswrapper[4756]: I0318 14:02:30.314941 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:30 crc kubenswrapper[4756]: E0318 14:02:30.315213 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:30 crc kubenswrapper[4756]: E0318 14:02:30.315320 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:30 crc kubenswrapper[4756]: E0318 14:02:30.315433 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.154692 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.154755 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.154779 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.154809 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.154826 4756 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:31Z","lastTransitionTime":"2026-03-18T14:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:02:31 crc kubenswrapper[4756]: E0318 14:02:31.176395 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:31Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.181568 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.181627 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.181649 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.181680 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.181703 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:31Z","lastTransitionTime":"2026-03-18T14:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:31 crc kubenswrapper[4756]: E0318 14:02:31.202025 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:31Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.206434 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.206493 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.206512 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.206534 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.206552 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:31Z","lastTransitionTime":"2026-03-18T14:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:31 crc kubenswrapper[4756]: E0318 14:02:31.226546 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:31Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.231722 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.231761 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.231775 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.231794 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.231809 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:31Z","lastTransitionTime":"2026-03-18T14:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:31 crc kubenswrapper[4756]: E0318 14:02:31.245424 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:31Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.250285 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.250335 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.250350 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.250370 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.250386 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:31Z","lastTransitionTime":"2026-03-18T14:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:31 crc kubenswrapper[4756]: E0318 14:02:31.267638 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:31Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:31 crc kubenswrapper[4756]: E0318 14:02:31.267867 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 14:02:31 crc kubenswrapper[4756]: I0318 14:02:31.314983 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:31 crc kubenswrapper[4756]: E0318 14:02:31.315307 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:32 crc kubenswrapper[4756]: I0318 14:02:32.314824 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:32 crc kubenswrapper[4756]: I0318 14:02:32.314871 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:32 crc kubenswrapper[4756]: I0318 14:02:32.314913 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:32 crc kubenswrapper[4756]: E0318 14:02:32.315046 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:32 crc kubenswrapper[4756]: E0318 14:02:32.315233 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:32 crc kubenswrapper[4756]: E0318 14:02:32.315361 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:33 crc kubenswrapper[4756]: I0318 14:02:33.315511 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:33 crc kubenswrapper[4756]: E0318 14:02:33.315762 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:34 crc kubenswrapper[4756]: I0318 14:02:34.314663 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:34 crc kubenswrapper[4756]: I0318 14:02:34.314691 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:34 crc kubenswrapper[4756]: I0318 14:02:34.314801 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:34 crc kubenswrapper[4756]: E0318 14:02:34.314816 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:34 crc kubenswrapper[4756]: E0318 14:02:34.314938 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:34 crc kubenswrapper[4756]: E0318 14:02:34.315069 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:34 crc kubenswrapper[4756]: E0318 14:02:34.417663 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:02:35 crc kubenswrapper[4756]: I0318 14:02:35.315415 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:35 crc kubenswrapper[4756]: E0318 14:02:35.315672 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:36 crc kubenswrapper[4756]: I0318 14:02:36.315491 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:36 crc kubenswrapper[4756]: I0318 14:02:36.315520 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:36 crc kubenswrapper[4756]: I0318 14:02:36.315584 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:36 crc kubenswrapper[4756]: E0318 14:02:36.315675 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:36 crc kubenswrapper[4756]: E0318 14:02:36.315781 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:36 crc kubenswrapper[4756]: E0318 14:02:36.315934 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:37 crc kubenswrapper[4756]: I0318 14:02:37.315405 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:37 crc kubenswrapper[4756]: E0318 14:02:37.315606 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:38 crc kubenswrapper[4756]: I0318 14:02:38.314916 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:38 crc kubenswrapper[4756]: E0318 14:02:38.315383 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:38 crc kubenswrapper[4756]: I0318 14:02:38.314989 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:38 crc kubenswrapper[4756]: E0318 14:02:38.315539 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:38 crc kubenswrapper[4756]: I0318 14:02:38.314919 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:38 crc kubenswrapper[4756]: E0318 14:02:38.315640 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.314554 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:39 crc kubenswrapper[4756]: E0318 14:02:39.314989 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.329398 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b96e094-221e-4549-bdcb-010e5d2e1346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a7ae4a678ad496f072b4a15ec21271439efe888a3dcaeab650918b804f3df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://7d9ab2bc4932c031750af26b1bee4bba55196c4a3f57252597753681ae34d37a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92f5c994aeb269fd08691355c96c16c3f714e3dca2d9b8329015e9d32d4e1ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ac2494ae20393fe275de98f946a29de7eb1cd1b7850bc6bde8c24a3a19cb471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.343077 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2265f7f44e618f83470e6549203b6fe5fc835af7ae7132a8d7369f41a59478a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.353656 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9xtp5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c68d4bce-1eb0-4ec4-99ae-4e901a9720ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c91288f00d66414f1dd3e8b997bc4d04e0e6d59c8224bb327cebecaa65d0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vmzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9xtp5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.364503 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"659e7fce-9d29-43c1-bb0b-ee43f9de3a16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1b88a20c05ac947a294c2cd0bb0f6bdb5bfc7ec44cbaa0d2accd9a2ad1eada\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a93c9f434c3a4960287d009199c43477d955c2bebceead1062510aa23215713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.386828 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e310797-439d-4233-b836-eda001687f95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd134740cd5838871bf22ef520c4b6bbea55fe9a71f7e62ed79c95278f6501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e318ca9ee1156d80fdc5a90cad7532d455f47934b81bf0d021aef26a4e81e913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://796ae83faa28cdf7c9d6430446142c700018b6a0a36a8b8320697eb8337f06ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bafb1c5317239b78ebba99044fab8dc260dcdefebce1e3715f4261fde578e3d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb5ade4109bc9eae0cc0f4071b2e5908d5e17f8558914a773412ba8f696d4f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0887c7848914176bfa5ae96de904588f541e7353ced7701c8eec642b3f6fde94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84532cc27d990f2f54164d8650142510b9ddd56a16649bbe8c1f5b5414ce23c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c449420b247fadc1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c449420b247fadc
1125167e082cd3561b855bc3b767b607c354d44bd32bd212a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.400854 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabc49ea4337e5c045fdc629d0019a7c7d15df989015acf4bea11664689ab789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4d1289e79bfba0ba2105a20d4473fe3fe2bfd01e68bab974ab1b2acd164cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: E0318 14:02:39.418329 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.420199 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c57d0ff3f12049069bf27365f93c03b86e9105f5642109d65222a1b22fccb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.434007 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fg65c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc241e7b-956f-4c3e-be3e-e239872d2c3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47c4086ca923f9407d06f42ffa1a44cfa67e037f3692782a192e701cbabed86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-18T14:01:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvtfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fg65c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.445390 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtqtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gfdtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc 
kubenswrapper[4756]: I0318 14:02:39.456277 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf3f6086-068f-4d8a-b24e-e8abb85ed346\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df49735023c19f361a14b1beb2bc8e24fb8dc022d5ec13265787b57242304b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f32a1883d53a644dcd74b35848533885ba8898c4d3feb8b3c73ca884cd1ae76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:00:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 14:00:01.327493 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 14:00:01.329968 1 observer_polling.go:159] Starting file observer\\\\nI0318 14:00:01.364670 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 14:00:01.367515 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 14:00:27.802989 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 14:00:27.803068 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:00:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acedcf0b285d570336220592028594ad63d5e998fb2124f1d74e75d4b54ca446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d43be82bbb7edfc6e24fe7a29ba15ae720fa04f7b827bd7715d4d1db025b94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.466696 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.479571 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wz5hm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13703604-4b4e-4eb2-b311-88457b667918\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a126ca0e73bc313c3b86c281a1627b238bb55f129d6ef0226b530e34d2b9d629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:23Z\\\",\\\"message\\\":\\\"2026-03-18T14:01:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_809a24e4-997b-4eb5-b078-38833d262484\\\\n2026-03-18T14:01:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_809a24e4-997b-4eb5-b078-38833d262484 to /host/opt/cni/bin/\\\\n2026-03-18T14:01:38Z [verbose] multus-daemon started\\\\n2026-03-18T14:01:38Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T14:02:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:02:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d6fl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wz5hm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.492609 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c4e85b-faff-4aca-847c-f33570c542a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9
f56502b9eaaf11938c290a3e453abb8459fdaeb1198282a8b5ceeeded828f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b458a414ad60c1cc3aec701a1461272b1c67af98084b56167600680cff8aa08f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29d05f5528f26b9dcab4870d05b23c12c748c4c017d6534effef5e68455553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5d9e22ece9d6606daecc7d8a2eb658c30196d6a19f937c50a395d4150ee336\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e067bf79578f200b2ac8615b62dc3b645dc44f3ade1a30424849c78c28a04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09014fbaad8c20a5508fbb334b0390359a46d3be984ed434e8fb5af5a0b5ecfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d3011b25927c599474a2678b9557800f9c3f942a6e0264c293f956b26ad0bd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47djd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9pzw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.503205 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c10ecbb9-ddab-48c7-9a86-abd122951622\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb5d360598242399969247a3bee8e2c36994f220e222c1d179da7582da84a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a2
2b8fbc2cec9026448ae87670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xppvk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qvpkg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.522614 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7cf6c03-98fc-4724-acde-a38f32f87496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T14:02:27Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI0318 14:02:27.190276 7243 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0318 14:02:27.190302 7243 address_set.go:302] 
New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0318 14:02:27.190339 7243 factory.go:1336] Added *v1.Node event handler 7\\\\nI0318 14:02:27.190370 7243 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0318 14:02:27.190612 7243 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0318 14:02:27.190687 7243 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 14:02:27.190705 7243 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 14:02:27.190713 7243 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 14:02:27.190737 7243 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 14:02:27.190752 7243 factory.go:656] Stopping watch factory\\\\nI0318 14:02:27.190769 7243 ovnkube.go:599] Stopped ovnkube\\\\nI0318 14:02:27.190809 7243 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 14:02:27.190831 7243 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 14:02:27.190944 7243 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:02:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78403c5f9d14742ff4
6af5df9d557ea1f1345343ee527a436124503a86ad376f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:01:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9q6cv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hgh2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.536788 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9ee2d78-d52a-4766-aa8e-f68a998a4df5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:00:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T13:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T14:01:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0318 14:01:08.137587 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 14:01:08.137715 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 14:01:08.138459 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2163962342/tls.crt::/tmp/serving-cert-2163962342/tls.key\\\\\\\"\\\\nI0318 14:01:08.338071 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 14:01:08.342232 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 14:01:08.342265 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 14:01:08.342295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 14:01:08.342305 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 14:01:08.351480 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 14:01:08.351509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 14:01:08.351520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 14:01:08.351525 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 14:01:08.351528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 14:01:08.351533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 14:01:08.351548 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 14:01:08.353963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T14:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:00:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbac22a7cf0268941f0a01cbd3179e5f5
2acc6fb64b77dd56913826ffa1242f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T14:00:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T14:00:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T13:59:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.546846 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.556515 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:39 crc kubenswrapper[4756]: I0318 14:02:39.568286 4756 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da217a0c-d8f3-4de1-b997-28d6683ede25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T14:01:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fbfc5e5297f222390a666aa00fdf98934ef0430057acd3d94eb7a0a7c554d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcfd64e934ef06fe9cf21d4b36ae864bd59e8
807ed5d15684d3ea0a7e92e96e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T14:01:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfbbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T14:01:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mdjt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:39Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:40 crc kubenswrapper[4756]: I0318 14:02:40.315158 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:40 crc kubenswrapper[4756]: I0318 14:02:40.315302 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:40 crc kubenswrapper[4756]: I0318 14:02:40.315409 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:40 crc kubenswrapper[4756]: E0318 14:02:40.315476 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:40 crc kubenswrapper[4756]: E0318 14:02:40.315686 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:40 crc kubenswrapper[4756]: E0318 14:02:40.315740 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.285991 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.286075 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.286102 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.286169 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.286196 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:41Z","lastTransitionTime":"2026-03-18T14:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:41 crc kubenswrapper[4756]: E0318 14:02:41.306162 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.310930 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.310978 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.310994 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.311024 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.311045 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:41Z","lastTransitionTime":"2026-03-18T14:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.314392 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:41 crc kubenswrapper[4756]: E0318 14:02:41.314523 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.315064 4756 scope.go:117] "RemoveContainer" containerID="3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d" Mar 18 14:02:41 crc kubenswrapper[4756]: E0318 14:02:41.315221 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" Mar 18 14:02:41 crc kubenswrapper[4756]: E0318 14:02:41.324519 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.328668 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.328713 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.328727 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.328748 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.328762 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:41Z","lastTransitionTime":"2026-03-18T14:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:41 crc kubenswrapper[4756]: E0318 14:02:41.346848 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.350195 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.350231 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.350243 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.350261 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.350273 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:41Z","lastTransitionTime":"2026-03-18T14:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:41 crc kubenswrapper[4756]: E0318 14:02:41.367952 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.371019 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.371217 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.371571 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.371745 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:41 crc kubenswrapper[4756]: I0318 14:02:41.371879 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:41Z","lastTransitionTime":"2026-03-18T14:02:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:41 crc kubenswrapper[4756]: E0318 14:02:41.387765 4756 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T14:02:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"aa49f241-7e2e-4961-9f17-9d946c9cd47b\\\",\\\"systemUUID\\\":\\\"4e5f0c17-af84-42a1-a5d9-87f2a3f7aad2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T14:02:41Z is after 2025-08-24T17:21:41Z" Mar 18 14:02:41 crc kubenswrapper[4756]: E0318 14:02:41.387915 4756 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 14:02:42 crc kubenswrapper[4756]: I0318 14:02:42.314751 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:42 crc kubenswrapper[4756]: I0318 14:02:42.314796 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:42 crc kubenswrapper[4756]: I0318 14:02:42.314767 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:42 crc kubenswrapper[4756]: E0318 14:02:42.314895 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:42 crc kubenswrapper[4756]: E0318 14:02:42.314998 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:42 crc kubenswrapper[4756]: E0318 14:02:42.315077 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:43 crc kubenswrapper[4756]: I0318 14:02:43.314665 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:43 crc kubenswrapper[4756]: E0318 14:02:43.314829 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:44 crc kubenswrapper[4756]: I0318 14:02:44.315394 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:44 crc kubenswrapper[4756]: I0318 14:02:44.315504 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:44 crc kubenswrapper[4756]: I0318 14:02:44.315504 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:44 crc kubenswrapper[4756]: E0318 14:02:44.315638 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:44 crc kubenswrapper[4756]: E0318 14:02:44.315890 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:44 crc kubenswrapper[4756]: E0318 14:02:44.316063 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:44 crc kubenswrapper[4756]: E0318 14:02:44.420446 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:02:45 crc kubenswrapper[4756]: I0318 14:02:45.315561 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:45 crc kubenswrapper[4756]: E0318 14:02:45.315755 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:46 crc kubenswrapper[4756]: I0318 14:02:46.315401 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:46 crc kubenswrapper[4756]: I0318 14:02:46.315440 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:46 crc kubenswrapper[4756]: I0318 14:02:46.315510 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:46 crc kubenswrapper[4756]: E0318 14:02:46.315598 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:46 crc kubenswrapper[4756]: E0318 14:02:46.315804 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:46 crc kubenswrapper[4756]: E0318 14:02:46.315944 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:47 crc kubenswrapper[4756]: I0318 14:02:47.314993 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:47 crc kubenswrapper[4756]: E0318 14:02:47.315285 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:48 crc kubenswrapper[4756]: I0318 14:02:48.314964 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:48 crc kubenswrapper[4756]: I0318 14:02:48.314964 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:48 crc kubenswrapper[4756]: I0318 14:02:48.315008 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:48 crc kubenswrapper[4756]: E0318 14:02:48.315276 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:48 crc kubenswrapper[4756]: E0318 14:02:48.315390 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:48 crc kubenswrapper[4756]: E0318 14:02:48.315519 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:49 crc kubenswrapper[4756]: I0318 14:02:49.315036 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:49 crc kubenswrapper[4756]: E0318 14:02:49.315350 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:49 crc kubenswrapper[4756]: I0318 14:02:49.362719 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=100.362682568 podStartE2EDuration="1m40.362682568s" podCreationTimestamp="2026-03-18 14:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:02:49.35797111 +0000 UTC m=+170.672389145" watchObservedRunningTime="2026-03-18 14:02:49.362682568 +0000 UTC m=+170.677100543" Mar 18 14:02:49 crc kubenswrapper[4756]: I0318 14:02:49.410555 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mdjt4" podStartSLOduration=113.410513607 podStartE2EDuration="1m53.410513607s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:02:49.410133136 +0000 UTC m=+170.724551111" watchObservedRunningTime="2026-03-18 14:02:49.410513607 +0000 UTC m=+170.724931582" Mar 18 14:02:49 crc kubenswrapper[4756]: E0318 14:02:49.421006 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 18 14:02:49 crc kubenswrapper[4756]: I0318 14:02:49.430029 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.430010456 podStartE2EDuration="48.430010456s" podCreationTimestamp="2026-03-18 14:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:02:49.429273646 +0000 UTC m=+170.743691631" watchObservedRunningTime="2026-03-18 14:02:49.430010456 +0000 UTC m=+170.744428431" Mar 18 14:02:49 crc kubenswrapper[4756]: I0318 14:02:49.470777 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9xtp5" podStartSLOduration=113.470747762 podStartE2EDuration="1m53.470747762s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:02:49.458921771 +0000 UTC m=+170.773339756" watchObservedRunningTime="2026-03-18 14:02:49.470747762 +0000 UTC m=+170.785165737" Mar 18 14:02:49 crc kubenswrapper[4756]: I0318 14:02:49.485864 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=93.485841702 podStartE2EDuration="1m33.485841702s" podCreationTimestamp="2026-03-18 14:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:02:49.485580435 +0000 UTC m=+170.799998410" watchObservedRunningTime="2026-03-18 14:02:49.485841702 +0000 UTC m=+170.800259677" Mar 18 14:02:49 crc kubenswrapper[4756]: I0318 14:02:49.512608 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=80.512589198 
podStartE2EDuration="1m20.512589198s" podCreationTimestamp="2026-03-18 14:01:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:02:49.511633943 +0000 UTC m=+170.826051928" watchObservedRunningTime="2026-03-18 14:02:49.512589198 +0000 UTC m=+170.827007183" Mar 18 14:02:49 crc kubenswrapper[4756]: I0318 14:02:49.551712 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fg65c" podStartSLOduration=113.55169315 podStartE2EDuration="1m53.55169315s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:02:49.551545886 +0000 UTC m=+170.865963881" watchObservedRunningTime="2026-03-18 14:02:49.55169315 +0000 UTC m=+170.866111125" Mar 18 14:02:49 crc kubenswrapper[4756]: I0318 14:02:49.599092 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=51.599070777 podStartE2EDuration="51.599070777s" podCreationTimestamp="2026-03-18 14:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:02:49.596642671 +0000 UTC m=+170.911060686" watchObservedRunningTime="2026-03-18 14:02:49.599070777 +0000 UTC m=+170.913488792" Mar 18 14:02:49 crc kubenswrapper[4756]: I0318 14:02:49.629257 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wz5hm" podStartSLOduration=113.629234116 podStartE2EDuration="1m53.629234116s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:02:49.628860896 +0000 UTC 
m=+170.943278871" watchObservedRunningTime="2026-03-18 14:02:49.629234116 +0000 UTC m=+170.943652101" Mar 18 14:02:49 crc kubenswrapper[4756]: I0318 14:02:49.654606 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b9pzw" podStartSLOduration=113.654574475 podStartE2EDuration="1m53.654574475s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:02:49.654216035 +0000 UTC m=+170.968634020" watchObservedRunningTime="2026-03-18 14:02:49.654574475 +0000 UTC m=+170.968992460" Mar 18 14:02:50 crc kubenswrapper[4756]: I0318 14:02:50.315249 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:50 crc kubenswrapper[4756]: I0318 14:02:50.315362 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:50 crc kubenswrapper[4756]: E0318 14:02:50.315390 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:50 crc kubenswrapper[4756]: E0318 14:02:50.315725 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:50 crc kubenswrapper[4756]: I0318 14:02:50.315857 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:50 crc kubenswrapper[4756]: E0318 14:02:50.315936 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.315378 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:51 crc kubenswrapper[4756]: E0318 14:02:51.315665 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.668644 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.668678 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.668687 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.668699 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.668709 4756 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T14:02:51Z","lastTransitionTime":"2026-03-18T14:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.716729 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podStartSLOduration=115.716706696 podStartE2EDuration="1m55.716706696s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:02:49.672802649 +0000 UTC m=+170.987220654" watchObservedRunningTime="2026-03-18 14:02:51.716706696 +0000 UTC m=+173.031124681" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.718110 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx"] Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.718507 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.720220 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.720698 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.721182 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.721503 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.871796 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a12d472b-6b1f-48c8-aa13-91f718b9a50a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6nhdx\" (UID: \"a12d472b-6b1f-48c8-aa13-91f718b9a50a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.871870 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a12d472b-6b1f-48c8-aa13-91f718b9a50a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6nhdx\" (UID: \"a12d472b-6b1f-48c8-aa13-91f718b9a50a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.871946 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a12d472b-6b1f-48c8-aa13-91f718b9a50a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6nhdx\" (UID: \"a12d472b-6b1f-48c8-aa13-91f718b9a50a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.871997 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a12d472b-6b1f-48c8-aa13-91f718b9a50a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6nhdx\" (UID: \"a12d472b-6b1f-48c8-aa13-91f718b9a50a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.872040 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a12d472b-6b1f-48c8-aa13-91f718b9a50a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6nhdx\" (UID: \"a12d472b-6b1f-48c8-aa13-91f718b9a50a\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.973099 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a12d472b-6b1f-48c8-aa13-91f718b9a50a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6nhdx\" (UID: \"a12d472b-6b1f-48c8-aa13-91f718b9a50a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.973161 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a12d472b-6b1f-48c8-aa13-91f718b9a50a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6nhdx\" (UID: \"a12d472b-6b1f-48c8-aa13-91f718b9a50a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.973190 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a12d472b-6b1f-48c8-aa13-91f718b9a50a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6nhdx\" (UID: \"a12d472b-6b1f-48c8-aa13-91f718b9a50a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.973219 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a12d472b-6b1f-48c8-aa13-91f718b9a50a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6nhdx\" (UID: \"a12d472b-6b1f-48c8-aa13-91f718b9a50a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.973242 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a12d472b-6b1f-48c8-aa13-91f718b9a50a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6nhdx\" (UID: \"a12d472b-6b1f-48c8-aa13-91f718b9a50a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.973428 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a12d472b-6b1f-48c8-aa13-91f718b9a50a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6nhdx\" (UID: \"a12d472b-6b1f-48c8-aa13-91f718b9a50a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.973460 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a12d472b-6b1f-48c8-aa13-91f718b9a50a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6nhdx\" (UID: \"a12d472b-6b1f-48c8-aa13-91f718b9a50a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.975163 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a12d472b-6b1f-48c8-aa13-91f718b9a50a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6nhdx\" (UID: \"a12d472b-6b1f-48c8-aa13-91f718b9a50a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 14:02:51.980591 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a12d472b-6b1f-48c8-aa13-91f718b9a50a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6nhdx\" (UID: \"a12d472b-6b1f-48c8-aa13-91f718b9a50a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:51 crc kubenswrapper[4756]: I0318 
14:02:51.994612 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a12d472b-6b1f-48c8-aa13-91f718b9a50a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6nhdx\" (UID: \"a12d472b-6b1f-48c8-aa13-91f718b9a50a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:52 crc kubenswrapper[4756]: I0318 14:02:52.034321 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" Mar 18 14:02:52 crc kubenswrapper[4756]: I0318 14:02:52.133235 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" event={"ID":"a12d472b-6b1f-48c8-aa13-91f718b9a50a","Type":"ContainerStarted","Data":"eb049c31c337ea5827682882d905a973a0f03c5e2067971e9b4d46a2f5b785ce"} Mar 18 14:02:52 crc kubenswrapper[4756]: I0318 14:02:52.314651 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:52 crc kubenswrapper[4756]: I0318 14:02:52.314755 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:52 crc kubenswrapper[4756]: E0318 14:02:52.314830 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:52 crc kubenswrapper[4756]: I0318 14:02:52.314874 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:52 crc kubenswrapper[4756]: E0318 14:02:52.315077 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:52 crc kubenswrapper[4756]: E0318 14:02:52.315260 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:52 crc kubenswrapper[4756]: I0318 14:02:52.367350 4756 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 18 14:02:52 crc kubenswrapper[4756]: I0318 14:02:52.380010 4756 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 14:02:53 crc kubenswrapper[4756]: I0318 14:02:53.138606 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" event={"ID":"a12d472b-6b1f-48c8-aa13-91f718b9a50a","Type":"ContainerStarted","Data":"2d45fc9c10665df25c3d152d920ff6234dc6829ceea53e38ae78262ce1307ec3"} Mar 18 14:02:53 crc kubenswrapper[4756]: I0318 14:02:53.155335 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhdx" podStartSLOduration=117.155313854 
podStartE2EDuration="1m57.155313854s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:02:53.154700727 +0000 UTC m=+174.469118712" watchObservedRunningTime="2026-03-18 14:02:53.155313854 +0000 UTC m=+174.469731839" Mar 18 14:02:53 crc kubenswrapper[4756]: I0318 14:02:53.314889 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:53 crc kubenswrapper[4756]: E0318 14:02:53.315093 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:53 crc kubenswrapper[4756]: I0318 14:02:53.691357 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs\") pod \"network-metrics-daemon-gfdtl\" (UID: \"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\") " pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:53 crc kubenswrapper[4756]: E0318 14:02:53.691507 4756 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:02:53 crc kubenswrapper[4756]: E0318 14:02:53.691600 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs podName:c11d2088-741c-4812-8eb2-ccfc3d0c7d11 nodeName:}" failed. No retries permitted until 2026-03-18 14:03:57.691577837 +0000 UTC m=+239.005995802 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs") pod "network-metrics-daemon-gfdtl" (UID: "c11d2088-741c-4812-8eb2-ccfc3d0c7d11") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 14:02:54 crc kubenswrapper[4756]: I0318 14:02:54.314480 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:54 crc kubenswrapper[4756]: I0318 14:02:54.314534 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:54 crc kubenswrapper[4756]: I0318 14:02:54.314709 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:54 crc kubenswrapper[4756]: E0318 14:02:54.314813 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:54 crc kubenswrapper[4756]: E0318 14:02:54.315851 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:54 crc kubenswrapper[4756]: E0318 14:02:54.315969 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:54 crc kubenswrapper[4756]: I0318 14:02:54.316219 4756 scope.go:117] "RemoveContainer" containerID="3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d" Mar 18 14:02:54 crc kubenswrapper[4756]: E0318 14:02:54.316533 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hgh2m_openshift-ovn-kubernetes(c7cf6c03-98fc-4724-acde-a38f32f87496)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" Mar 18 14:02:54 crc kubenswrapper[4756]: E0318 14:02:54.422330 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:02:55 crc kubenswrapper[4756]: I0318 14:02:55.314727 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:55 crc kubenswrapper[4756]: E0318 14:02:55.314913 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:56 crc kubenswrapper[4756]: I0318 14:02:56.314757 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:56 crc kubenswrapper[4756]: I0318 14:02:56.314820 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:56 crc kubenswrapper[4756]: I0318 14:02:56.314842 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:56 crc kubenswrapper[4756]: E0318 14:02:56.315264 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:56 crc kubenswrapper[4756]: E0318 14:02:56.315728 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:56 crc kubenswrapper[4756]: E0318 14:02:56.315808 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:57 crc kubenswrapper[4756]: I0318 14:02:57.315505 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:57 crc kubenswrapper[4756]: E0318 14:02:57.315740 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:58 crc kubenswrapper[4756]: I0318 14:02:58.314435 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:02:58 crc kubenswrapper[4756]: I0318 14:02:58.314522 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:02:58 crc kubenswrapper[4756]: I0318 14:02:58.314902 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:02:58 crc kubenswrapper[4756]: E0318 14:02:58.315092 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:02:58 crc kubenswrapper[4756]: E0318 14:02:58.315273 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:02:58 crc kubenswrapper[4756]: E0318 14:02:58.315449 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:02:59 crc kubenswrapper[4756]: I0318 14:02:59.317712 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:02:59 crc kubenswrapper[4756]: E0318 14:02:59.317984 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:02:59 crc kubenswrapper[4756]: E0318 14:02:59.423027 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:03:00 crc kubenswrapper[4756]: I0318 14:03:00.314962 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:00 crc kubenswrapper[4756]: I0318 14:03:00.315026 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:00 crc kubenswrapper[4756]: I0318 14:03:00.314968 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:00 crc kubenswrapper[4756]: E0318 14:03:00.315143 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:03:00 crc kubenswrapper[4756]: E0318 14:03:00.315341 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:03:00 crc kubenswrapper[4756]: E0318 14:03:00.315404 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:03:01 crc kubenswrapper[4756]: I0318 14:03:01.315411 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:01 crc kubenswrapper[4756]: E0318 14:03:01.316483 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:03:02 crc kubenswrapper[4756]: I0318 14:03:02.315070 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:02 crc kubenswrapper[4756]: I0318 14:03:02.315158 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:02 crc kubenswrapper[4756]: I0318 14:03:02.315163 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:02 crc kubenswrapper[4756]: E0318 14:03:02.315875 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:03:02 crc kubenswrapper[4756]: E0318 14:03:02.315992 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:03:02 crc kubenswrapper[4756]: E0318 14:03:02.315616 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:03:03 crc kubenswrapper[4756]: I0318 14:03:03.315300 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:03 crc kubenswrapper[4756]: E0318 14:03:03.316079 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:03:04 crc kubenswrapper[4756]: I0318 14:03:04.314802 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:04 crc kubenswrapper[4756]: I0318 14:03:04.314897 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:04 crc kubenswrapper[4756]: I0318 14:03:04.314982 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:04 crc kubenswrapper[4756]: E0318 14:03:04.315496 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:03:04 crc kubenswrapper[4756]: E0318 14:03:04.315315 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:03:04 crc kubenswrapper[4756]: E0318 14:03:04.315637 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:03:04 crc kubenswrapper[4756]: E0318 14:03:04.424286 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:03:05 crc kubenswrapper[4756]: I0318 14:03:05.315013 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:05 crc kubenswrapper[4756]: E0318 14:03:05.315240 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:03:06 crc kubenswrapper[4756]: I0318 14:03:06.314692 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:06 crc kubenswrapper[4756]: I0318 14:03:06.314787 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:06 crc kubenswrapper[4756]: I0318 14:03:06.314949 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:06 crc kubenswrapper[4756]: E0318 14:03:06.314949 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:03:06 crc kubenswrapper[4756]: E0318 14:03:06.315141 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:03:06 crc kubenswrapper[4756]: E0318 14:03:06.315204 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:03:07 crc kubenswrapper[4756]: I0318 14:03:07.315436 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:07 crc kubenswrapper[4756]: E0318 14:03:07.315655 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:03:08 crc kubenswrapper[4756]: I0318 14:03:08.315026 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:08 crc kubenswrapper[4756]: E0318 14:03:08.315248 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:03:08 crc kubenswrapper[4756]: I0318 14:03:08.315293 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:08 crc kubenswrapper[4756]: I0318 14:03:08.315356 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:08 crc kubenswrapper[4756]: E0318 14:03:08.315823 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:03:08 crc kubenswrapper[4756]: E0318 14:03:08.315974 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:03:08 crc kubenswrapper[4756]: I0318 14:03:08.316576 4756 scope.go:117] "RemoveContainer" containerID="3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d" Mar 18 14:03:09 crc kubenswrapper[4756]: I0318 14:03:09.196732 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovnkube-controller/3.log" Mar 18 14:03:09 crc kubenswrapper[4756]: I0318 14:03:09.200760 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerStarted","Data":"b382ec1fcb9424f71540b6ace5af00998ac0b7cf5b869c2c61709c03b6fa1c94"} Mar 18 14:03:09 crc kubenswrapper[4756]: I0318 14:03:09.201349 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:03:09 crc kubenswrapper[4756]: I0318 14:03:09.220777 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gfdtl"] Mar 18 14:03:09 crc kubenswrapper[4756]: I0318 14:03:09.220898 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:09 crc kubenswrapper[4756]: E0318 14:03:09.220999 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:03:09 crc kubenswrapper[4756]: I0318 14:03:09.241913 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podStartSLOduration=133.241892588 podStartE2EDuration="2m13.241892588s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:09.241402485 +0000 UTC m=+190.555820480" watchObservedRunningTime="2026-03-18 14:03:09.241892588 +0000 UTC m=+190.556310573" Mar 18 14:03:09 crc kubenswrapper[4756]: E0318 14:03:09.425899 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:03:10 crc kubenswrapper[4756]: I0318 14:03:10.208274 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wz5hm_13703604-4b4e-4eb2-b311-88457b667918/kube-multus/1.log" Mar 18 14:03:10 crc kubenswrapper[4756]: I0318 14:03:10.209054 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wz5hm_13703604-4b4e-4eb2-b311-88457b667918/kube-multus/0.log" Mar 18 14:03:10 crc kubenswrapper[4756]: I0318 14:03:10.209230 4756 generic.go:334] "Generic (PLEG): container finished" podID="13703604-4b4e-4eb2-b311-88457b667918" containerID="a126ca0e73bc313c3b86c281a1627b238bb55f129d6ef0226b530e34d2b9d629" exitCode=1 Mar 18 14:03:10 crc kubenswrapper[4756]: I0318 14:03:10.209504 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wz5hm" event={"ID":"13703604-4b4e-4eb2-b311-88457b667918","Type":"ContainerDied","Data":"a126ca0e73bc313c3b86c281a1627b238bb55f129d6ef0226b530e34d2b9d629"} Mar 18 14:03:10 crc 
kubenswrapper[4756]: I0318 14:03:10.209596 4756 scope.go:117] "RemoveContainer" containerID="2bb725de607f819c1700665d29cffa433fd9d8eb0345e3601c6437c5371a64c3" Mar 18 14:03:10 crc kubenswrapper[4756]: I0318 14:03:10.211742 4756 scope.go:117] "RemoveContainer" containerID="a126ca0e73bc313c3b86c281a1627b238bb55f129d6ef0226b530e34d2b9d629" Mar 18 14:03:10 crc kubenswrapper[4756]: E0318 14:03:10.212172 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-wz5hm_openshift-multus(13703604-4b4e-4eb2-b311-88457b667918)\"" pod="openshift-multus/multus-wz5hm" podUID="13703604-4b4e-4eb2-b311-88457b667918" Mar 18 14:03:10 crc kubenswrapper[4756]: I0318 14:03:10.314938 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:10 crc kubenswrapper[4756]: I0318 14:03:10.314983 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:10 crc kubenswrapper[4756]: I0318 14:03:10.315019 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:10 crc kubenswrapper[4756]: E0318 14:03:10.315091 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:03:10 crc kubenswrapper[4756]: E0318 14:03:10.315306 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:03:10 crc kubenswrapper[4756]: E0318 14:03:10.315778 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:03:11 crc kubenswrapper[4756]: I0318 14:03:11.215066 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wz5hm_13703604-4b4e-4eb2-b311-88457b667918/kube-multus/1.log" Mar 18 14:03:11 crc kubenswrapper[4756]: I0318 14:03:11.314658 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:11 crc kubenswrapper[4756]: E0318 14:03:11.314932 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:03:12 crc kubenswrapper[4756]: I0318 14:03:12.315381 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:12 crc kubenswrapper[4756]: I0318 14:03:12.315402 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:12 crc kubenswrapper[4756]: I0318 14:03:12.315597 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:12 crc kubenswrapper[4756]: E0318 14:03:12.315723 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:03:12 crc kubenswrapper[4756]: E0318 14:03:12.315889 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:03:12 crc kubenswrapper[4756]: E0318 14:03:12.316142 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:03:13 crc kubenswrapper[4756]: I0318 14:03:13.315447 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:13 crc kubenswrapper[4756]: E0318 14:03:13.315693 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:03:14 crc kubenswrapper[4756]: I0318 14:03:14.315003 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:14 crc kubenswrapper[4756]: E0318 14:03:14.315474 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:03:14 crc kubenswrapper[4756]: I0318 14:03:14.315173 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:14 crc kubenswrapper[4756]: I0318 14:03:14.315078 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:14 crc kubenswrapper[4756]: E0318 14:03:14.315558 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:03:14 crc kubenswrapper[4756]: E0318 14:03:14.315708 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:03:14 crc kubenswrapper[4756]: E0318 14:03:14.426999 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:03:15 crc kubenswrapper[4756]: I0318 14:03:15.318836 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:15 crc kubenswrapper[4756]: E0318 14:03:15.318984 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:03:16 crc kubenswrapper[4756]: I0318 14:03:16.315443 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:16 crc kubenswrapper[4756]: I0318 14:03:16.315466 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:16 crc kubenswrapper[4756]: I0318 14:03:16.315580 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:16 crc kubenswrapper[4756]: E0318 14:03:16.315689 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:03:16 crc kubenswrapper[4756]: E0318 14:03:16.315746 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:03:16 crc kubenswrapper[4756]: E0318 14:03:16.315821 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:03:17 crc kubenswrapper[4756]: I0318 14:03:17.315102 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:17 crc kubenswrapper[4756]: E0318 14:03:17.315301 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:03:18 crc kubenswrapper[4756]: I0318 14:03:18.266743 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:18 crc kubenswrapper[4756]: I0318 14:03:18.266898 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:18 crc kubenswrapper[4756]: I0318 14:03:18.266969 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:18 crc kubenswrapper[4756]: E0318 14:03:18.267028 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:05:20.266997488 +0000 UTC m=+321.581415473 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:18 crc kubenswrapper[4756]: E0318 14:03:18.267080 4756 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:03:18 crc kubenswrapper[4756]: E0318 14:03:18.267177 4756 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:03:18 crc kubenswrapper[4756]: E0318 14:03:18.267207 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 14:05:20.267176972 +0000 UTC m=+321.581594987 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 14:03:18 crc kubenswrapper[4756]: E0318 14:03:18.267301 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-18 14:05:20.267274455 +0000 UTC m=+321.581692490 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 14:03:18 crc kubenswrapper[4756]: I0318 14:03:18.315101 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:18 crc kubenswrapper[4756]: I0318 14:03:18.315178 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:18 crc kubenswrapper[4756]: I0318 14:03:18.315190 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:18 crc kubenswrapper[4756]: E0318 14:03:18.315297 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:03:18 crc kubenswrapper[4756]: E0318 14:03:18.315442 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:03:18 crc kubenswrapper[4756]: E0318 14:03:18.315591 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:03:18 crc kubenswrapper[4756]: I0318 14:03:18.367904 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:18 crc kubenswrapper[4756]: I0318 14:03:18.368026 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:18 crc kubenswrapper[4756]: E0318 14:03:18.368257 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:03:18 crc kubenswrapper[4756]: E0318 14:03:18.368293 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:03:18 crc kubenswrapper[4756]: E0318 
14:03:18.368311 4756 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:03:18 crc kubenswrapper[4756]: E0318 14:03:18.368379 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 14:05:20.368353727 +0000 UTC m=+321.682771742 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:03:18 crc kubenswrapper[4756]: E0318 14:03:18.368257 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 14:03:18 crc kubenswrapper[4756]: E0318 14:03:18.368433 4756 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 14:03:18 crc kubenswrapper[4756]: E0318 14:03:18.368455 4756 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 
14:03:18 crc kubenswrapper[4756]: E0318 14:03:18.368513 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 14:05:20.368493921 +0000 UTC m=+321.682911936 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 14:03:19 crc kubenswrapper[4756]: I0318 14:03:19.315005 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:19 crc kubenswrapper[4756]: E0318 14:03:19.317080 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:03:19 crc kubenswrapper[4756]: E0318 14:03:19.428269 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:03:20 crc kubenswrapper[4756]: I0318 14:03:20.314437 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:20 crc kubenswrapper[4756]: I0318 14:03:20.314473 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:20 crc kubenswrapper[4756]: E0318 14:03:20.314572 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:03:20 crc kubenswrapper[4756]: I0318 14:03:20.314609 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:20 crc kubenswrapper[4756]: E0318 14:03:20.314792 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:03:20 crc kubenswrapper[4756]: E0318 14:03:20.315484 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:03:21 crc kubenswrapper[4756]: I0318 14:03:21.315005 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:21 crc kubenswrapper[4756]: E0318 14:03:21.316107 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:03:22 crc kubenswrapper[4756]: I0318 14:03:22.314492 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:22 crc kubenswrapper[4756]: I0318 14:03:22.314526 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:22 crc kubenswrapper[4756]: E0318 14:03:22.314727 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:03:22 crc kubenswrapper[4756]: I0318 14:03:22.314530 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:22 crc kubenswrapper[4756]: E0318 14:03:22.315174 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:03:22 crc kubenswrapper[4756]: E0318 14:03:22.314982 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:03:23 crc kubenswrapper[4756]: I0318 14:03:23.315067 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:23 crc kubenswrapper[4756]: E0318 14:03:23.315308 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:03:23 crc kubenswrapper[4756]: I0318 14:03:23.315567 4756 scope.go:117] "RemoveContainer" containerID="a126ca0e73bc313c3b86c281a1627b238bb55f129d6ef0226b530e34d2b9d629" Mar 18 14:03:24 crc kubenswrapper[4756]: I0318 14:03:24.264423 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wz5hm_13703604-4b4e-4eb2-b311-88457b667918/kube-multus/1.log" Mar 18 14:03:24 crc kubenswrapper[4756]: I0318 14:03:24.264778 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wz5hm" event={"ID":"13703604-4b4e-4eb2-b311-88457b667918","Type":"ContainerStarted","Data":"6d271e322ff997b6b5d2c9dcc6a298d8e41b723e1cb2c048962de813499e1b54"} Mar 18 14:03:24 crc kubenswrapper[4756]: I0318 14:03:24.314863 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:24 crc kubenswrapper[4756]: I0318 14:03:24.314944 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:24 crc kubenswrapper[4756]: E0318 14:03:24.315031 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:03:24 crc kubenswrapper[4756]: I0318 14:03:24.314944 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:24 crc kubenswrapper[4756]: E0318 14:03:24.315229 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:03:24 crc kubenswrapper[4756]: E0318 14:03:24.315341 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:03:24 crc kubenswrapper[4756]: E0318 14:03:24.429445 4756 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:03:25 crc kubenswrapper[4756]: I0318 14:03:25.315247 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:25 crc kubenswrapper[4756]: E0318 14:03:25.315482 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:03:26 crc kubenswrapper[4756]: I0318 14:03:26.314590 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:26 crc kubenswrapper[4756]: I0318 14:03:26.314697 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:26 crc kubenswrapper[4756]: E0318 14:03:26.314737 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:03:26 crc kubenswrapper[4756]: I0318 14:03:26.314791 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:26 crc kubenswrapper[4756]: E0318 14:03:26.314907 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:03:26 crc kubenswrapper[4756]: E0318 14:03:26.315082 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:03:27 crc kubenswrapper[4756]: I0318 14:03:27.314642 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:27 crc kubenswrapper[4756]: E0318 14:03:27.314863 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:03:28 crc kubenswrapper[4756]: I0318 14:03:28.314348 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:28 crc kubenswrapper[4756]: I0318 14:03:28.314392 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:28 crc kubenswrapper[4756]: I0318 14:03:28.314348 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:28 crc kubenswrapper[4756]: E0318 14:03:28.314516 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 14:03:28 crc kubenswrapper[4756]: E0318 14:03:28.314642 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 14:03:28 crc kubenswrapper[4756]: E0318 14:03:28.314793 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:03:29 crc kubenswrapper[4756]: I0318 14:03:29.315199 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:29 crc kubenswrapper[4756]: E0318 14:03:29.316069 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gfdtl" podUID="c11d2088-741c-4812-8eb2-ccfc3d0c7d11" Mar 18 14:03:30 crc kubenswrapper[4756]: I0318 14:03:30.315166 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:03:30 crc kubenswrapper[4756]: I0318 14:03:30.315213 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:03:30 crc kubenswrapper[4756]: I0318 14:03:30.315383 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:03:30 crc kubenswrapper[4756]: I0318 14:03:30.318417 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 14:03:30 crc kubenswrapper[4756]: I0318 14:03:30.319479 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 14:03:30 crc kubenswrapper[4756]: I0318 14:03:30.320501 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 14:03:30 crc kubenswrapper[4756]: I0318 14:03:30.320617 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 14:03:31 crc kubenswrapper[4756]: I0318 14:03:31.314520 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:31 crc kubenswrapper[4756]: I0318 14:03:31.317632 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 14:03:31 crc kubenswrapper[4756]: I0318 14:03:31.317976 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.720106 4756 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.784839 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pvzfk"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.785713 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.787455 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.788461 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.797071 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.797112 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.797423 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.797592 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.797631 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.798090 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.800866 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.801870 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.802093 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 
18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.814790 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.815149 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.815228 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.815329 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.815515 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.815543 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.815665 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.816161 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.816462 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.817213 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-rs6j9"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.817489 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 14:03:32 
crc kubenswrapper[4756]: I0318 14:03:32.818035 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.820717 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.821070 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.821360 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.821413 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.821653 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.821699 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.822389 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.834019 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.834812 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.837304 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.837559 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.843107 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.845026 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.846463 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.860401 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.860669 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.860891 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.860940 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.860978 4756 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.861281 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.861320 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.861966 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.862530 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.863174 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4ckd5"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.863606 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.864349 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.864780 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.864951 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9s7gz"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.865306 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.865732 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.866596 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.866685 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-95vm5"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.867000 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.867969 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.868489 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.868823 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9sqqb"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.869080 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.869581 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.869781 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-77tzj"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.869852 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.870272 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.873380 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.873607 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.873770 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.878714 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.878927 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.879086 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.879269 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.880052 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.880204 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.880599 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.882151 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-f9hzx"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.882476 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.889330 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.890340 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.897579 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.898283 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.898369 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.899007 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.899294 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9pdb4"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.899648 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.901982 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564042-6mtlx"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.902202 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.902297 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.902401 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.902633 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564042-6mtlx" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.902686 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.902945 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.903043 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.905525 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.905877 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.906375 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.906738 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qd587"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.906902 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.907200 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.907452 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.916001 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-588c8"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.916639 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.916728 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.916758 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qd587" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.917211 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-prmvd"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.917787 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.918166 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.918338 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.918565 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-prmvd" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.920893 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-k5xg9"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.921674 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.925305 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.925685 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.925978 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.939009 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.939717 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.941728 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.942174 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.942232 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.942826 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.951961 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2faea523-d2d3-46fd-b623-fe2cc8928c8d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2jfx5\" (UID: \"2faea523-d2d3-46fd-b623-fe2cc8928c8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952003 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cad737e-466f-4c7c-b004-c4723692f479-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rxqlj\" (UID: \"8cad737e-466f-4c7c-b004-c4723692f479\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952029 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-image-import-ca\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952047 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tcxk\" (UniqueName: \"kubernetes.io/projected/2faea523-d2d3-46fd-b623-fe2cc8928c8d-kube-api-access-2tcxk\") pod \"cluster-image-registry-operator-dc59b4c8b-2jfx5\" (UID: \"2faea523-d2d3-46fd-b623-fe2cc8928c8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952102 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c7067572-3edd-48f1-a03d-7f83887ab9ca-etcd-serving-ca\") pod 
\"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952190 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2faea523-d2d3-46fd-b623-fe2cc8928c8d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2jfx5\" (UID: \"2faea523-d2d3-46fd-b623-fe2cc8928c8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952210 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3985e570-6d23-4928-a018-40e9b5868b89-config\") pod \"machine-api-operator-5694c8668f-rs6j9\" (UID: \"3985e570-6d23-4928-a018-40e9b5868b89\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952235 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7067572-3edd-48f1-a03d-7f83887ab9ca-serving-cert\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952261 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8hjx\" (UniqueName: \"kubernetes.io/projected/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-kube-api-access-k8hjx\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952281 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvx9w\" (UniqueName: \"kubernetes.io/projected/3985e570-6d23-4928-a018-40e9b5868b89-kube-api-access-fvx9w\") pod \"machine-api-operator-5694c8668f-rs6j9\" (UID: \"3985e570-6d23-4928-a018-40e9b5868b89\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952296 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5676f\" (UniqueName: \"kubernetes.io/projected/c7067572-3edd-48f1-a03d-7f83887ab9ca-kube-api-access-5676f\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952312 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3985e570-6d23-4928-a018-40e9b5868b89-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-rs6j9\" (UID: \"3985e570-6d23-4928-a018-40e9b5868b89\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952328 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cad737e-466f-4c7c-b004-c4723692f479-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rxqlj\" (UID: \"8cad737e-466f-4c7c-b004-c4723692f479\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952352 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-etcd-client\") pod 
\"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952368 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952391 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-audit\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952408 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-audit-dir\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952428 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-encryption-config\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952449 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/3985e570-6d23-4928-a018-40e9b5868b89-images\") pod \"machine-api-operator-5694c8668f-rs6j9\" (UID: \"3985e570-6d23-4928-a018-40e9b5868b89\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952473 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-serving-cert\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952489 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2faea523-d2d3-46fd-b623-fe2cc8928c8d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2jfx5\" (UID: \"2faea523-d2d3-46fd-b623-fe2cc8928c8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952509 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-config\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952531 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8cad737e-466f-4c7c-b004-c4723692f479-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rxqlj\" (UID: \"8cad737e-466f-4c7c-b004-c4723692f479\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj" Mar 18 14:03:32 crc 
kubenswrapper[4756]: I0318 14:03:32.952547 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c7067572-3edd-48f1-a03d-7f83887ab9ca-etcd-client\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952572 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-node-pullsecrets\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952588 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c7067572-3edd-48f1-a03d-7f83887ab9ca-audit-dir\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952605 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c7067572-3edd-48f1-a03d-7f83887ab9ca-audit-policies\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952624 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c7067572-3edd-48f1-a03d-7f83887ab9ca-encryption-config\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: 
\"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952642 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-etcd-serving-ca\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952662 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7067572-3edd-48f1-a03d-7f83887ab9ca-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.952762 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.954829 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2ptxb"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.955366 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.955992 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.956206 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-2ptxb" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.962292 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.962497 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.962292 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.962313 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.962329 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.962391 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.962427 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.962973 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.963052 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.965208 4756 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"serving-cert" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.965977 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.966926 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.967179 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.968046 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.968297 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.968386 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.968540 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.968736 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.968847 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.968969 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 14:03:32 crc 
kubenswrapper[4756]: I0318 14:03:32.969344 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.969551 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.969631 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.969762 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.969764 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.969852 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.969594 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.969945 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.970022 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.969003 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.969553 4756 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.970103 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.970023 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.970145 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mm2qg"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.971364 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.973972 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.974197 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mm2qg" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.974573 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.975648 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.976619 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wvqhb"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.977235 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.977708 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-br5r4"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.978166 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.978382 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.978493 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wvqhb" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.978509 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-br5r4" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.978600 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.978623 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.978938 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-dtbtw"] Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.979342 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dtbtw" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.980662 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.981105 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.982474 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 14:03:32 crc kubenswrapper[4756]: I0318 14:03:32.990707 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.013249 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.019212 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.021471 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver/apiserver-76f77b778f-pvzfk"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.022140 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-rs6j9"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.022219 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.022288 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.021782 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.021785 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.021813 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.021981 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.022031 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.023085 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.022341 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.024505 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.024751 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4ckd5"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.026315 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9pdb4"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.026709 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.030968 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.030331 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.033049 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.033062 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.033071 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qd587"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.033310 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564042-6mtlx"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.037523 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.037559 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-prmvd"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.039053 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9sqqb"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.040432 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-95vm5"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.041647 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tc72q"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.042488 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tc72q" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.042599 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.043957 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.044926 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vxnd9"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.045905 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vxnd9" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.046337 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-77tzj"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.047348 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9s7gz"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.047981 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.048558 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.049845 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wvqhb"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.050855 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l"] Mar 18 14:03:33 crc 
kubenswrapper[4756]: I0318 14:03:33.052194 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2ptxb"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.052964 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mm2qg"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.053914 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vxnd9"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.054953 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.056237 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-588c8"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.057486 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.058312 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3985e570-6d23-4928-a018-40e9b5868b89-config\") pod \"machine-api-operator-5694c8668f-rs6j9\" (UID: \"3985e570-6d23-4928-a018-40e9b5868b89\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.058621 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-k5xg9"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059068 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42d7ef3d-2c3b-455c-9457-44441f1bfcff-serving-cert\") pod \"controller-manager-879f6c89f-95vm5\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059106 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7067572-3edd-48f1-a03d-7f83887ab9ca-serving-cert\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059146 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d199014d-5b92-48d1-966d-30af0da2e1c2-default-certificate\") pod \"router-default-5444994796-f9hzx\" (UID: \"d199014d-5b92-48d1-966d-30af0da2e1c2\") " pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059167 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d199014d-5b92-48d1-966d-30af0da2e1c2-service-ca-bundle\") pod \"router-default-5444994796-f9hzx\" (UID: \"d199014d-5b92-48d1-966d-30af0da2e1c2\") " pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059188 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8hjx\" (UniqueName: \"kubernetes.io/projected/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-kube-api-access-k8hjx\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059222 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvx9w\" (UniqueName: 
\"kubernetes.io/projected/3985e570-6d23-4928-a018-40e9b5868b89-kube-api-access-fvx9w\") pod \"machine-api-operator-5694c8668f-rs6j9\" (UID: \"3985e570-6d23-4928-a018-40e9b5868b89\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059234 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3985e570-6d23-4928-a018-40e9b5868b89-config\") pod \"machine-api-operator-5694c8668f-rs6j9\" (UID: \"3985e570-6d23-4928-a018-40e9b5868b89\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059245 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5676f\" (UniqueName: \"kubernetes.io/projected/c7067572-3edd-48f1-a03d-7f83887ab9ca-kube-api-access-5676f\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059267 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2sdw\" (UniqueName: \"kubernetes.io/projected/f6816a11-7f42-40ca-9bbe-ec5f3d0d019a-kube-api-access-s2sdw\") pod \"openshift-apiserver-operator-796bbdcf4f-bxxfm\" (UID: \"f6816a11-7f42-40ca-9bbe-ec5f3d0d019a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059295 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-95vm5\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:33 crc 
kubenswrapper[4756]: I0318 14:03:33.059324 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3985e570-6d23-4928-a018-40e9b5868b89-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-rs6j9\" (UID: \"3985e570-6d23-4928-a018-40e9b5868b89\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059348 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-etcd-client\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059371 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059397 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cad737e-466f-4c7c-b004-c4723692f479-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rxqlj\" (UID: \"8cad737e-466f-4c7c-b004-c4723692f479\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059432 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znvtz\" (UniqueName: \"kubernetes.io/projected/d199014d-5b92-48d1-966d-30af0da2e1c2-kube-api-access-znvtz\") pod \"router-default-5444994796-f9hzx\" (UID: 
\"d199014d-5b92-48d1-966d-30af0da2e1c2\") " pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059461 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-audit\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059485 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-audit-dir\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059513 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xck8l\" (UID: \"2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059539 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btwv7\" (UniqueName: \"kubernetes.io/projected/42d7ef3d-2c3b-455c-9457-44441f1bfcff-kube-api-access-btwv7\") pod \"controller-manager-879f6c89f-95vm5\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059561 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/efc607b9-67d2-4f27-8ebc-03f067d11caf-etcd-client\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059583 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e271e583-b5d3-452e-a009-f8b21a8121d9-config\") pod \"service-ca-operator-777779d784-mxm9h\" (UID: \"e271e583-b5d3-452e-a009-f8b21a8121d9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059610 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-encryption-config\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059656 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv4dm\" (UniqueName: \"kubernetes.io/projected/efc607b9-67d2-4f27-8ebc-03f067d11caf-kube-api-access-xv4dm\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059673 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xck8l\" (UID: \"2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l" Mar 18 14:03:33 crc 
kubenswrapper[4756]: I0318 14:03:33.059691 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d199014d-5b92-48d1-966d-30af0da2e1c2-stats-auth\") pod \"router-default-5444994796-f9hzx\" (UID: \"d199014d-5b92-48d1-966d-30af0da2e1c2\") " pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059711 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-serving-cert\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059798 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-audit-dir\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059833 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2faea523-d2d3-46fd-b623-fe2cc8928c8d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2jfx5\" (UID: \"2faea523-d2d3-46fd-b623-fe2cc8928c8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059860 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q42gv\" (UniqueName: \"kubernetes.io/projected/bfa461e5-a4e9-4cfa-a279-df6d4a56c973-kube-api-access-q42gv\") pod \"auto-csr-approver-29564042-6mtlx\" (UID: \"bfa461e5-a4e9-4cfa-a279-df6d4a56c973\") " 
pod="openshift-infra/auto-csr-approver-29564042-6mtlx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059885 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3985e570-6d23-4928-a018-40e9b5868b89-images\") pod \"machine-api-operator-5694c8668f-rs6j9\" (UID: \"3985e570-6d23-4928-a018-40e9b5868b89\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.059901 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efc607b9-67d2-4f27-8ebc-03f067d11caf-serving-cert\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.060190 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.060317 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-audit\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.060749 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-config\") pod \"controller-manager-879f6c89f-95vm5\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.060846 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/efc607b9-67d2-4f27-8ebc-03f067d11caf-etcd-service-ca\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.060874 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdswk\" (UniqueName: \"kubernetes.io/projected/c7298879-219c-4329-a9e6-1854fc22b1d9-kube-api-access-wdswk\") pod \"packageserver-d55dfcdfc-kld5c\" (UID: \"c7298879-219c-4329-a9e6-1854fc22b1d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.060897 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efc607b9-67d2-4f27-8ebc-03f067d11caf-config\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.060922 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-config\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.060985 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-client-ca\") pod \"route-controller-manager-6576b87f9c-6mb6v\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.061012 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6816a11-7f42-40ca-9bbe-ec5f3d0d019a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bxxfm\" (UID: \"f6816a11-7f42-40ca-9bbe-ec5f3d0d019a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.061033 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8cad737e-466f-4c7c-b004-c4723692f479-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rxqlj\" (UID: \"8cad737e-466f-4c7c-b004-c4723692f479\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.061049 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-serving-cert\") pod \"route-controller-manager-6576b87f9c-6mb6v\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.061066 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c7067572-3edd-48f1-a03d-7f83887ab9ca-etcd-client\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.061104 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c7298879-219c-4329-a9e6-1854fc22b1d9-apiservice-cert\") pod \"packageserver-d55dfcdfc-kld5c\" (UID: \"c7298879-219c-4329-a9e6-1854fc22b1d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.061137 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-node-pullsecrets\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.061154 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c7067572-3edd-48f1-a03d-7f83887ab9ca-audit-dir\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.061171 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c7298879-219c-4329-a9e6-1854fc22b1d9-tmpfs\") pod \"packageserver-d55dfcdfc-kld5c\" (UID: \"c7298879-219c-4329-a9e6-1854fc22b1d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.061197 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c7067572-3edd-48f1-a03d-7f83887ab9ca-audit-policies\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.061251 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3985e570-6d23-4928-a018-40e9b5868b89-images\") pod \"machine-api-operator-5694c8668f-rs6j9\" (UID: \"3985e570-6d23-4928-a018-40e9b5868b89\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.061881 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-br5r4"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.062024 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-config\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.062229 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c7067572-3edd-48f1-a03d-7f83887ab9ca-audit-dir\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.062281 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-node-pullsecrets\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.062515 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-etcd-serving-ca\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " 
pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.062594 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c7067572-3edd-48f1-a03d-7f83887ab9ca-encryption-config\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.062752 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c7067572-3edd-48f1-a03d-7f83887ab9ca-audit-policies\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.063255 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-etcd-serving-ca\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.063343 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e271e583-b5d3-452e-a009-f8b21a8121d9-serving-cert\") pod \"service-ca-operator-777779d784-mxm9h\" (UID: \"e271e583-b5d3-452e-a009-f8b21a8121d9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.063384 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7067572-3edd-48f1-a03d-7f83887ab9ca-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: 
\"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.063409 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c7298879-219c-4329-a9e6-1854fc22b1d9-webhook-cert\") pod \"packageserver-d55dfcdfc-kld5c\" (UID: \"c7298879-219c-4329-a9e6-1854fc22b1d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.063450 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-client-ca\") pod \"controller-manager-879f6c89f-95vm5\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.063925 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7067572-3edd-48f1-a03d-7f83887ab9ca-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.063991 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4-config\") pod \"kube-controller-manager-operator-78b949d7b-xck8l\" (UID: \"2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.064052 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-config\") pod \"route-controller-manager-6576b87f9c-6mb6v\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.064086 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2faea523-d2d3-46fd-b623-fe2cc8928c8d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2jfx5\" (UID: \"2faea523-d2d3-46fd-b623-fe2cc8928c8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.064358 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c7067572-3edd-48f1-a03d-7f83887ab9ca-etcd-client\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.064687 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-encryption-config\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.064922 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.064955 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.065202 4756 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2faea523-d2d3-46fd-b623-fe2cc8928c8d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2jfx5\" (UID: \"2faea523-d2d3-46fd-b623-fe2cc8928c8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.065283 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-image-import-ca\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.065333 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tcxk\" (UniqueName: \"kubernetes.io/projected/2faea523-d2d3-46fd-b623-fe2cc8928c8d-kube-api-access-2tcxk\") pod \"cluster-image-registry-operator-dc59b4c8b-2jfx5\" (UID: \"2faea523-d2d3-46fd-b623-fe2cc8928c8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.065349 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-etcd-client\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.065358 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cad737e-466f-4c7c-b004-c4723692f479-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rxqlj\" (UID: \"8cad737e-466f-4c7c-b004-c4723692f479\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj" Mar 18 14:03:33 crc 
kubenswrapper[4756]: I0318 14:03:33.065414 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3985e570-6d23-4928-a018-40e9b5868b89-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-rs6j9\" (UID: \"3985e570-6d23-4928-a018-40e9b5868b89\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.065507 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6816a11-7f42-40ca-9bbe-ec5f3d0d019a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bxxfm\" (UID: \"f6816a11-7f42-40ca-9bbe-ec5f3d0d019a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.065494 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-serving-cert\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.065565 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/efc607b9-67d2-4f27-8ebc-03f067d11caf-etcd-ca\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.065663 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c7067572-3edd-48f1-a03d-7f83887ab9ca-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.065906 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cad737e-466f-4c7c-b004-c4723692f479-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rxqlj\" (UID: \"8cad737e-466f-4c7c-b004-c4723692f479\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.065975 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22sd5\" (UniqueName: \"kubernetes.io/projected/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-kube-api-access-22sd5\") pod \"route-controller-manager-6576b87f9c-6mb6v\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.066012 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgq6j\" (UniqueName: \"kubernetes.io/projected/e271e583-b5d3-452e-a009-f8b21a8121d9-kube-api-access-hgq6j\") pod \"service-ca-operator-777779d784-mxm9h\" (UID: \"e271e583-b5d3-452e-a009-f8b21a8121d9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.066008 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.066048 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/2faea523-d2d3-46fd-b623-fe2cc8928c8d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2jfx5\" (UID: \"2faea523-d2d3-46fd-b623-fe2cc8928c8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.066080 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d199014d-5b92-48d1-966d-30af0da2e1c2-metrics-certs\") pod \"router-default-5444994796-f9hzx\" (UID: \"d199014d-5b92-48d1-966d-30af0da2e1c2\") " pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.066166 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c7067572-3edd-48f1-a03d-7f83887ab9ca-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.066374 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-image-import-ca\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.066544 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tc72q"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.067081 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7067572-3edd-48f1-a03d-7f83887ab9ca-serving-cert\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.067144 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.067082 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cad737e-466f-4c7c-b004-c4723692f479-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rxqlj\" (UID: \"8cad737e-466f-4c7c-b004-c4723692f479\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.068066 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c7067572-3edd-48f1-a03d-7f83887ab9ca-encryption-config\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.068106 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dtbtw"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.069246 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-p9mcq"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.069250 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2faea523-d2d3-46fd-b623-fe2cc8928c8d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2jfx5\" (UID: \"2faea523-d2d3-46fd-b623-fe2cc8928c8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.071175 4756 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-machine-config-operator/machine-config-server-9klwh"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.071294 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.072075 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-p9mcq"] Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.072173 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9klwh" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.088433 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.107178 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.127461 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.147678 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.166820 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btwv7\" (UniqueName: \"kubernetes.io/projected/42d7ef3d-2c3b-455c-9457-44441f1bfcff-kube-api-access-btwv7\") pod \"controller-manager-879f6c89f-95vm5\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.166855 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/efc607b9-67d2-4f27-8ebc-03f067d11caf-etcd-client\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.166875 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e271e583-b5d3-452e-a009-f8b21a8121d9-config\") pod \"service-ca-operator-777779d784-mxm9h\" (UID: \"e271e583-b5d3-452e-a009-f8b21a8121d9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.166895 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xck8l\" (UID: \"2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.166910 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv4dm\" (UniqueName: \"kubernetes.io/projected/efc607b9-67d2-4f27-8ebc-03f067d11caf-kube-api-access-xv4dm\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.166928 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d199014d-5b92-48d1-966d-30af0da2e1c2-stats-auth\") pod \"router-default-5444994796-f9hzx\" (UID: \"d199014d-5b92-48d1-966d-30af0da2e1c2\") " pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.166943 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xck8l\" (UID: \"2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.166965 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q42gv\" (UniqueName: \"kubernetes.io/projected/bfa461e5-a4e9-4cfa-a279-df6d4a56c973-kube-api-access-q42gv\") pod \"auto-csr-approver-29564042-6mtlx\" (UID: \"bfa461e5-a4e9-4cfa-a279-df6d4a56c973\") " pod="openshift-infra/auto-csr-approver-29564042-6mtlx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.166980 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efc607b9-67d2-4f27-8ebc-03f067d11caf-serving-cert\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167000 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-config\") pod \"controller-manager-879f6c89f-95vm5\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167018 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/efc607b9-67d2-4f27-8ebc-03f067d11caf-etcd-service-ca\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 
18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167035 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efc607b9-67d2-4f27-8ebc-03f067d11caf-config\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167052 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdswk\" (UniqueName: \"kubernetes.io/projected/c7298879-219c-4329-a9e6-1854fc22b1d9-kube-api-access-wdswk\") pod \"packageserver-d55dfcdfc-kld5c\" (UID: \"c7298879-219c-4329-a9e6-1854fc22b1d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167071 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6816a11-7f42-40ca-9bbe-ec5f3d0d019a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bxxfm\" (UID: \"f6816a11-7f42-40ca-9bbe-ec5f3d0d019a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167093 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-client-ca\") pod \"route-controller-manager-6576b87f9c-6mb6v\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167107 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c7298879-219c-4329-a9e6-1854fc22b1d9-apiservice-cert\") pod \"packageserver-d55dfcdfc-kld5c\" (UID: 
\"c7298879-219c-4329-a9e6-1854fc22b1d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167142 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-serving-cert\") pod \"route-controller-manager-6576b87f9c-6mb6v\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167157 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c7298879-219c-4329-a9e6-1854fc22b1d9-tmpfs\") pod \"packageserver-d55dfcdfc-kld5c\" (UID: \"c7298879-219c-4329-a9e6-1854fc22b1d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167184 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e271e583-b5d3-452e-a009-f8b21a8121d9-serving-cert\") pod \"service-ca-operator-777779d784-mxm9h\" (UID: \"e271e583-b5d3-452e-a009-f8b21a8121d9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167199 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c7298879-219c-4329-a9e6-1854fc22b1d9-webhook-cert\") pod \"packageserver-d55dfcdfc-kld5c\" (UID: \"c7298879-219c-4329-a9e6-1854fc22b1d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167214 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-client-ca\") pod \"controller-manager-879f6c89f-95vm5\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167258 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4-config\") pod \"kube-controller-manager-operator-78b949d7b-xck8l\" (UID: \"2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167275 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-config\") pod \"route-controller-manager-6576b87f9c-6mb6v\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167298 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6816a11-7f42-40ca-9bbe-ec5f3d0d019a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bxxfm\" (UID: \"f6816a11-7f42-40ca-9bbe-ec5f3d0d019a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167314 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/efc607b9-67d2-4f27-8ebc-03f067d11caf-etcd-ca\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167316 4756 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167340 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgq6j\" (UniqueName: \"kubernetes.io/projected/e271e583-b5d3-452e-a009-f8b21a8121d9-kube-api-access-hgq6j\") pod \"service-ca-operator-777779d784-mxm9h\" (UID: \"e271e583-b5d3-452e-a009-f8b21a8121d9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167358 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d199014d-5b92-48d1-966d-30af0da2e1c2-metrics-certs\") pod \"router-default-5444994796-f9hzx\" (UID: \"d199014d-5b92-48d1-966d-30af0da2e1c2\") " pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167377 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22sd5\" (UniqueName: \"kubernetes.io/projected/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-kube-api-access-22sd5\") pod \"route-controller-manager-6576b87f9c-6mb6v\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167393 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42d7ef3d-2c3b-455c-9457-44441f1bfcff-serving-cert\") pod \"controller-manager-879f6c89f-95vm5\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167408 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/d199014d-5b92-48d1-966d-30af0da2e1c2-default-certificate\") pod \"router-default-5444994796-f9hzx\" (UID: \"d199014d-5b92-48d1-966d-30af0da2e1c2\") " pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167426 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d199014d-5b92-48d1-966d-30af0da2e1c2-service-ca-bundle\") pod \"router-default-5444994796-f9hzx\" (UID: \"d199014d-5b92-48d1-966d-30af0da2e1c2\") " pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167469 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2sdw\" (UniqueName: \"kubernetes.io/projected/f6816a11-7f42-40ca-9bbe-ec5f3d0d019a-kube-api-access-s2sdw\") pod \"openshift-apiserver-operator-796bbdcf4f-bxxfm\" (UID: \"f6816a11-7f42-40ca-9bbe-ec5f3d0d019a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167485 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-95vm5\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.167542 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znvtz\" (UniqueName: \"kubernetes.io/projected/d199014d-5b92-48d1-966d-30af0da2e1c2-kube-api-access-znvtz\") pod \"router-default-5444994796-f9hzx\" (UID: \"d199014d-5b92-48d1-966d-30af0da2e1c2\") " pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.168367 
4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-config\") pod \"controller-manager-879f6c89f-95vm5\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.168752 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c7298879-219c-4329-a9e6-1854fc22b1d9-tmpfs\") pod \"packageserver-d55dfcdfc-kld5c\" (UID: \"c7298879-219c-4329-a9e6-1854fc22b1d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.169183 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4-config\") pod \"kube-controller-manager-operator-78b949d7b-xck8l\" (UID: \"2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.169466 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d199014d-5b92-48d1-966d-30af0da2e1c2-service-ca-bundle\") pod \"router-default-5444994796-f9hzx\" (UID: \"d199014d-5b92-48d1-966d-30af0da2e1c2\") " pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.169631 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-95vm5\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:33 crc 
kubenswrapper[4756]: I0318 14:03:33.169675 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-client-ca\") pod \"controller-manager-879f6c89f-95vm5\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.170431 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xck8l\" (UID: \"2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.170945 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d199014d-5b92-48d1-966d-30af0da2e1c2-stats-auth\") pod \"router-default-5444994796-f9hzx\" (UID: \"d199014d-5b92-48d1-966d-30af0da2e1c2\") " pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.171019 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c7298879-219c-4329-a9e6-1854fc22b1d9-webhook-cert\") pod \"packageserver-d55dfcdfc-kld5c\" (UID: \"c7298879-219c-4329-a9e6-1854fc22b1d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.171147 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d199014d-5b92-48d1-966d-30af0da2e1c2-metrics-certs\") pod \"router-default-5444994796-f9hzx\" (UID: \"d199014d-5b92-48d1-966d-30af0da2e1c2\") " 
pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.171273 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c7298879-219c-4329-a9e6-1854fc22b1d9-apiservice-cert\") pod \"packageserver-d55dfcdfc-kld5c\" (UID: \"c7298879-219c-4329-a9e6-1854fc22b1d9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.172736 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42d7ef3d-2c3b-455c-9457-44441f1bfcff-serving-cert\") pod \"controller-manager-879f6c89f-95vm5\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.175335 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d199014d-5b92-48d1-966d-30af0da2e1c2-default-certificate\") pod \"router-default-5444994796-f9hzx\" (UID: \"d199014d-5b92-48d1-966d-30af0da2e1c2\") " pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.189105 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.208034 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.228060 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.249846 4756 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.268140 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.280868 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6816a11-7f42-40ca-9bbe-ec5f3d0d019a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bxxfm\" (UID: \"f6816a11-7f42-40ca-9bbe-ec5f3d0d019a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.288253 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.292189 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-serving-cert\") pod \"route-controller-manager-6576b87f9c-6mb6v\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.308146 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.318960 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6816a11-7f42-40ca-9bbe-ec5f3d0d019a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bxxfm\" (UID: \"f6816a11-7f42-40ca-9bbe-ec5f3d0d019a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 
14:03:33.329145 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.340369 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-config\") pod \"route-controller-manager-6576b87f9c-6mb6v\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.348618 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.369286 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.388716 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.398490 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-client-ca\") pod \"route-controller-manager-6576b87f9c-6mb6v\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.408495 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.427580 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.448884 4756 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.461004 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efc607b9-67d2-4f27-8ebc-03f067d11caf-serving-cert\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.468922 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.481149 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/efc607b9-67d2-4f27-8ebc-03f067d11caf-etcd-client\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.489149 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.507921 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.518536 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efc607b9-67d2-4f27-8ebc-03f067d11caf-config\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.529771 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 14:03:33 
crc kubenswrapper[4756]: I0318 14:03:33.539264 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/efc607b9-67d2-4f27-8ebc-03f067d11caf-etcd-ca\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.549692 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.558476 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/efc607b9-67d2-4f27-8ebc-03f067d11caf-etcd-service-ca\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.569159 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.588879 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.608597 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.627731 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.649012 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.668892 
4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.689345 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.708763 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.729575 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.751908 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.768939 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.788060 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.807587 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.837683 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.849260 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.869386 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 14:03:33 
crc kubenswrapper[4756]: I0318 14:03:33.888947 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.908861 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.926664 4756 request.go:700] Waited for 1.004359072s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.929181 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.948995 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.967531 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 14:03:33 crc kubenswrapper[4756]: I0318 14:03:33.990779 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.008372 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.028696 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.048768 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.053045 
4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e271e583-b5d3-452e-a009-f8b21a8121d9-serving-cert\") pod \"service-ca-operator-777779d784-mxm9h\" (UID: \"e271e583-b5d3-452e-a009-f8b21a8121d9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.068406 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.078291 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e271e583-b5d3-452e-a009-f8b21a8121d9-config\") pod \"service-ca-operator-777779d784-mxm9h\" (UID: \"e271e583-b5d3-452e-a009-f8b21a8121d9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.088358 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.108383 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.128960 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.147828 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.167917 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.228968 4756 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.249406 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.269196 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.289358 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.309100 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.329090 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.348825 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.368294 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.393693 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.408648 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.428571 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 14:03:34 crc 
kubenswrapper[4756]: I0318 14:03:34.449603 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.468770 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.489831 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.508727 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.528930 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.548304 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.568737 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.588069 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.608842 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.628858 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.649086 4756 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.669903 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.728391 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.732344 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.733843 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.748543 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.788712 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8hjx\" (UniqueName: \"kubernetes.io/projected/5f7750ad-9fab-43ab-bcf2-fc8eaa797013-kube-api-access-k8hjx\") pod \"apiserver-76f77b778f-pvzfk\" (UID: \"5f7750ad-9fab-43ab-bcf2-fc8eaa797013\") " pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.806421 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvx9w\" (UniqueName: \"kubernetes.io/projected/3985e570-6d23-4928-a018-40e9b5868b89-kube-api-access-fvx9w\") pod \"machine-api-operator-5694c8668f-rs6j9\" (UID: \"3985e570-6d23-4928-a018-40e9b5868b89\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.833596 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5676f\" (UniqueName: 
\"kubernetes.io/projected/c7067572-3edd-48f1-a03d-7f83887ab9ca-kube-api-access-5676f\") pod \"apiserver-7bbb656c7d-j87k5\" (UID: \"c7067572-3edd-48f1-a03d-7f83887ab9ca\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.841662 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2faea523-d2d3-46fd-b623-fe2cc8928c8d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2jfx5\" (UID: \"2faea523-d2d3-46fd-b623-fe2cc8928c8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.861470 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8cad737e-466f-4c7c-b004-c4723692f479-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rxqlj\" (UID: \"8cad737e-466f-4c7c-b004-c4723692f479\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.884935 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tcxk\" (UniqueName: \"kubernetes.io/projected/2faea523-d2d3-46fd-b623-fe2cc8928c8d-kube-api-access-2tcxk\") pod \"cluster-image-registry-operator-dc59b4c8b-2jfx5\" (UID: \"2faea523-d2d3-46fd-b623-fe2cc8928c8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.888808 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.908220 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.910076 4756 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.920686 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.927025 4756 request.go:700] Waited for 1.8550456s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/secrets?fieldSelector=metadata.name%3Dcsi-hostpath-provisioner-sa-dockercfg-qd74k&limit=500&resourceVersion=0 Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.928755 4756 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.936218 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.948104 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.963180 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.969931 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.971072 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj" Mar 18 14:03:34 crc kubenswrapper[4756]: I0318 14:03:34.989349 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.025047 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btwv7\" (UniqueName: \"kubernetes.io/projected/42d7ef3d-2c3b-455c-9457-44441f1bfcff-kube-api-access-btwv7\") pod \"controller-manager-879f6c89f-95vm5\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.052801 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q42gv\" (UniqueName: \"kubernetes.io/projected/bfa461e5-a4e9-4cfa-a279-df6d4a56c973-kube-api-access-q42gv\") pod \"auto-csr-approver-29564042-6mtlx\" (UID: \"bfa461e5-a4e9-4cfa-a279-df6d4a56c973\") " pod="openshift-infra/auto-csr-approver-29564042-6mtlx" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.071189 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xck8l\" (UID: \"2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.088042 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdswk\" (UniqueName: \"kubernetes.io/projected/c7298879-219c-4329-a9e6-1854fc22b1d9-kube-api-access-wdswk\") pod \"packageserver-d55dfcdfc-kld5c\" (UID: \"c7298879-219c-4329-a9e6-1854fc22b1d9\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.103551 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22sd5\" (UniqueName: \"kubernetes.io/projected/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-kube-api-access-22sd5\") pod \"route-controller-manager-6576b87f9c-6mb6v\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.134742 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znvtz\" (UniqueName: \"kubernetes.io/projected/d199014d-5b92-48d1-966d-30af0da2e1c2-kube-api-access-znvtz\") pod \"router-default-5444994796-f9hzx\" (UID: \"d199014d-5b92-48d1-966d-30af0da2e1c2\") " pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.141913 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgq6j\" (UniqueName: \"kubernetes.io/projected/e271e583-b5d3-452e-a009-f8b21a8121d9-kube-api-access-hgq6j\") pod \"service-ca-operator-777779d784-mxm9h\" (UID: \"e271e583-b5d3-452e-a009-f8b21a8121d9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.152290 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.161712 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv4dm\" (UniqueName: \"kubernetes.io/projected/efc607b9-67d2-4f27-8ebc-03f067d11caf-kube-api-access-xv4dm\") pod \"etcd-operator-b45778765-588c8\" (UID: \"efc607b9-67d2-4f27-8ebc-03f067d11caf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.188369 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.190684 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-rs6j9"] Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.190966 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2sdw\" (UniqueName: \"kubernetes.io/projected/f6816a11-7f42-40ca-9bbe-ec5f3d0d019a-kube-api-access-s2sdw\") pod \"openshift-apiserver-operator-796bbdcf4f-bxxfm\" (UID: \"f6816a11-7f42-40ca-9bbe-ec5f3d0d019a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193010 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193052 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/37777140-c5a3-4503-abbd-0d316a535630-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kgtbj\" (UID: \"37777140-c5a3-4503-abbd-0d316a535630\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193084 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3a78872-6429-4e96-af97-024c68c537d7-service-ca-bundle\") pod \"authentication-operator-69f744f599-9s7gz\" (UID: \"d3a78872-6429-4e96-af97-024c68c537d7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193142 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7ae5dd-fe5a-47b7-b9bf-a3e86c39be3c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rm8qz\" (UID: \"cc7ae5dd-fe5a-47b7-b9bf-a3e86c39be3c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193177 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ht8w\" (UniqueName: \"kubernetes.io/projected/d3a78872-6429-4e96-af97-024c68c537d7-kube-api-access-8ht8w\") pod \"authentication-operator-69f744f599-9s7gz\" (UID: \"d3a78872-6429-4e96-af97-024c68c537d7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193206 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/487d1c97-b703-4f1b-8c77-c23b4366a467-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4ckd5\" (UID: \"487d1c97-b703-4f1b-8c77-c23b4366a467\") " pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193238 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f7c7f16d-c4d5-4a88-ac61-6de2735da80c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kxcdf\" (UID: \"f7c7f16d-c4d5-4a88-ac61-6de2735da80c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193267 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193299 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a15d24cf-4182-44bb-9d60-33649137cc83-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193327 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-trusted-ca-bundle\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " 
pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193354 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-registry-tls\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193385 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7nl8\" (UniqueName: \"kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-kube-api-access-w7nl8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193415 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfd097dc-6461-4bf5-8952-2a0ebd249f35-trusted-ca\") pod \"console-operator-58897d9998-77tzj\" (UID: \"cfd097dc-6461-4bf5-8952-2a0ebd249f35\") " pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193447 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-bound-sa-token\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193498 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhxmv\" (UniqueName: 
\"kubernetes.io/projected/e79e3bba-d970-4daf-983d-aa4894129506-kube-api-access-fhxmv\") pod \"migrator-59844c95c7-qd587\" (UID: \"e79e3bba-d970-4daf-983d-aa4894129506\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qd587" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193530 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b93428b-357a-4f73-ba6f-c46a4c475a98-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xktt\" (UID: \"9b93428b-357a-4f73-ba6f-c46a4c475a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193563 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37777140-c5a3-4503-abbd-0d316a535630-serving-cert\") pod \"openshift-config-operator-7777fb866f-kgtbj\" (UID: \"37777140-c5a3-4503-abbd-0d316a535630\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193610 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkjw2\" (UniqueName: \"kubernetes.io/projected/b811256d-cc06-4ab8-a482-5ea91528ad95-kube-api-access-mkjw2\") pod \"cluster-samples-operator-665b6dd947-prmvd\" (UID: \"b811256d-cc06-4ab8-a482-5ea91528ad95\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-prmvd" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193640 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67423a72-d377-414a-b293-56f824df4d45-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2gbt5\" (UID: 
\"67423a72-d377-414a-b293-56f824df4d45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193698 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/259f3fb2-856e-4885-8955-6703997715f9-signing-key\") pod \"service-ca-9c57cc56f-2ptxb\" (UID: \"259f3fb2-856e-4885-8955-6703997715f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ptxb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193728 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfd097dc-6461-4bf5-8952-2a0ebd249f35-serving-cert\") pod \"console-operator-58897d9998-77tzj\" (UID: \"cfd097dc-6461-4bf5-8952-2a0ebd249f35\") " pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193758 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j7xj\" (UniqueName: \"kubernetes.io/projected/cc7ae5dd-fe5a-47b7-b9bf-a3e86c39be3c-kube-api-access-6j7xj\") pod \"package-server-manager-789f6589d5-rm8qz\" (UID: \"cc7ae5dd-fe5a-47b7-b9bf-a3e86c39be3c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193786 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b811256d-cc06-4ab8-a482-5ea91528ad95-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-prmvd\" (UID: \"b811256d-cc06-4ab8-a482-5ea91528ad95\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-prmvd" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193818 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d01fc0fa-9be1-4493-be47-c420ebb5341d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-49g5j\" (UID: \"d01fc0fa-9be1-4493-be47-c420ebb5341d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193848 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx8lv\" (UniqueName: \"kubernetes.io/projected/f88a8bdd-954f-455c-aad1-03b1988afa37-kube-api-access-dx8lv\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193881 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q52c\" (UniqueName: \"kubernetes.io/projected/259f3fb2-856e-4885-8955-6703997715f9-kube-api-access-5q52c\") pod \"service-ca-9c57cc56f-2ptxb\" (UID: \"259f3fb2-856e-4885-8955-6703997715f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ptxb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193913 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f7c7f16d-c4d5-4a88-ac61-6de2735da80c-images\") pod \"machine-config-operator-74547568cd-kxcdf\" (UID: \"f7c7f16d-c4d5-4a88-ac61-6de2735da80c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193945 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a15d24cf-4182-44bb-9d60-33649137cc83-registry-certificates\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: 
\"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.193972 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx8mb\" (UniqueName: \"kubernetes.io/projected/37777140-c5a3-4503-abbd-0d316a535630-kube-api-access-cx8mb\") pod \"openshift-config-operator-7777fb866f-kgtbj\" (UID: \"37777140-c5a3-4503-abbd-0d316a535630\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194005 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/487d1c97-b703-4f1b-8c77-c23b4366a467-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4ckd5\" (UID: \"487d1c97-b703-4f1b-8c77-c23b4366a467\") " pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194052 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f88a8bdd-954f-455c-aad1-03b1988afa37-console-oauth-config\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194082 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-oauth-serving-cert\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194109 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/259f3fb2-856e-4885-8955-6703997715f9-signing-cabundle\") pod \"service-ca-9c57cc56f-2ptxb\" (UID: \"259f3fb2-856e-4885-8955-6703997715f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ptxb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194165 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f88a8bdd-954f-455c-aad1-03b1988afa37-console-serving-cert\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194214 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194247 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd097dc-6461-4bf5-8952-2a0ebd249f35-config\") pod \"console-operator-58897d9998-77tzj\" (UID: \"cfd097dc-6461-4bf5-8952-2a0ebd249f35\") " pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194274 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a57b45d3-c455-4b6c-a56e-6884e65670a0-srv-cert\") pod \"catalog-operator-68c6474976-hp2sc\" (UID: \"a57b45d3-c455-4b6c-a56e-6884e65670a0\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194301 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a15d24cf-4182-44bb-9d60-33649137cc83-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194344 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194377 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjgd5\" (UniqueName: \"kubernetes.io/projected/487d1c97-b703-4f1b-8c77-c23b4366a467-kube-api-access-zjgd5\") pod \"marketplace-operator-79b997595-4ckd5\" (UID: \"487d1c97-b703-4f1b-8c77-c23b4366a467\") " pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194415 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86tcb\" (UniqueName: \"kubernetes.io/projected/9b93428b-357a-4f73-ba6f-c46a4c475a98-kube-api-access-86tcb\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xktt\" (UID: \"9b93428b-357a-4f73-ba6f-c46a4c475a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 
14:03:35.194448 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3a78872-6429-4e96-af97-024c68c537d7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9s7gz\" (UID: \"d3a78872-6429-4e96-af97-024c68c537d7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194486 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np5tk\" (UniqueName: \"kubernetes.io/projected/d3f5ac66-56e5-4477-ac1b-1ef496242243-kube-api-access-np5tk\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194691 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67ea2f51-e62b-497a-98dc-0835647d0d5f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-54jrv\" (UID: \"67ea2f51-e62b-497a-98dc-0835647d0d5f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194740 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194769 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf9xd\" 
(UniqueName: \"kubernetes.io/projected/a57b45d3-c455-4b6c-a56e-6884e65670a0-kube-api-access-jf9xd\") pod \"catalog-operator-68c6474976-hp2sc\" (UID: \"a57b45d3-c455-4b6c-a56e-6884e65670a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194795 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3a78872-6429-4e96-af97-024c68c537d7-config\") pod \"authentication-operator-69f744f599-9s7gz\" (UID: \"d3a78872-6429-4e96-af97-024c68c537d7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194818 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194839 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d01fc0fa-9be1-4493-be47-c420ebb5341d-metrics-tls\") pod \"ingress-operator-5b745b69d9-49g5j\" (UID: \"d01fc0fa-9be1-4493-be47-c420ebb5341d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194863 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a15d24cf-4182-44bb-9d60-33649137cc83-trusted-ca\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: E0318 14:03:35.194895 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:35.694877233 +0000 UTC m=+217.009295308 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.194930 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3f5ac66-56e5-4477-ac1b-1ef496242243-audit-dir\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.195268 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7c7f16d-c4d5-4a88-ac61-6de2735da80c-proxy-tls\") pod \"machine-config-operator-74547568cd-kxcdf\" (UID: \"f7c7f16d-c4d5-4a88-ac61-6de2735da80c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.195302 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/67ea2f51-e62b-497a-98dc-0835647d0d5f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-54jrv\" (UID: \"67ea2f51-e62b-497a-98dc-0835647d0d5f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.195380 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d01fc0fa-9be1-4493-be47-c420ebb5341d-trusted-ca\") pod \"ingress-operator-5b745b69d9-49g5j\" (UID: \"d01fc0fa-9be1-4493-be47-c420ebb5341d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.195432 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.195464 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a57b45d3-c455-4b6c-a56e-6884e65670a0-profile-collector-cert\") pod \"catalog-operator-68c6474976-hp2sc\" (UID: \"a57b45d3-c455-4b6c-a56e-6884e65670a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.195532 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2fck\" (UniqueName: \"kubernetes.io/projected/d01fc0fa-9be1-4493-be47-c420ebb5341d-kube-api-access-f2fck\") pod \"ingress-operator-5b745b69d9-49g5j\" (UID: 
\"d01fc0fa-9be1-4493-be47-c420ebb5341d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.195563 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5qbk\" (UniqueName: \"kubernetes.io/projected/cfd097dc-6461-4bf5-8952-2a0ebd249f35-kube-api-access-z5qbk\") pod \"console-operator-58897d9998-77tzj\" (UID: \"cfd097dc-6461-4bf5-8952-2a0ebd249f35\") " pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.195618 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.195796 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.195924 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67423a72-d377-414a-b293-56f824df4d45-config\") pod \"kube-apiserver-operator-766d6c64bb-2gbt5\" (UID: \"67423a72-d377-414a-b293-56f824df4d45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.196086 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq9vw\" (UniqueName: \"kubernetes.io/projected/67ea2f51-e62b-497a-98dc-0835647d0d5f-kube-api-access-xq9vw\") pod \"openshift-controller-manager-operator-756b6f6bc6-54jrv\" (UID: \"67ea2f51-e62b-497a-98dc-0835647d0d5f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.196152 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67423a72-d377-414a-b293-56f824df4d45-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2gbt5\" (UID: \"67423a72-d377-414a-b293-56f824df4d45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.196259 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 
14:03:35.196373 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.196416 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.196455 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b93428b-357a-4f73-ba6f-c46a4c475a98-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xktt\" (UID: \"9b93428b-357a-4f73-ba6f-c46a4c475a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.196513 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3a78872-6429-4e96-af97-024c68c537d7-serving-cert\") pod \"authentication-operator-69f744f599-9s7gz\" (UID: \"d3a78872-6429-4e96-af97-024c68c537d7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.196547 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.196676 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7m59\" (UniqueName: \"kubernetes.io/projected/f7c7f16d-c4d5-4a88-ac61-6de2735da80c-kube-api-access-b7m59\") pod \"machine-config-operator-74547568cd-kxcdf\" (UID: \"f7c7f16d-c4d5-4a88-ac61-6de2735da80c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.196730 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-console-config\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.196770 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-service-ca\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.196835 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-audit-policies\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" 
Mar 18 14:03:35 crc kubenswrapper[4756]: W0318 14:03:35.202288 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3985e570_6d23_4928_a018_40e9b5868b89.slice/crio-eccce61de3eb34244a7a41551c52ffb01f8e6a663546192ac5992fa6ca2037ad WatchSource:0}: Error finding container eccce61de3eb34244a7a41551c52ffb01f8e6a663546192ac5992fa6ca2037ad: Status 404 returned error can't find the container with id eccce61de3eb34244a7a41551c52ffb01f8e6a663546192ac5992fa6ca2037ad Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.202467 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.233246 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5"] Mar 18 14:03:35 crc kubenswrapper[4756]: W0318 14:03:35.248070 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2faea523_d2d3_46fd_b623_fe2cc8928c8d.slice/crio-74fa39d2648a05a8e76076823a1c54c34401c70f6ab412dce1f7cdbcc2efce92 WatchSource:0}: Error finding container 74fa39d2648a05a8e76076823a1c54c34401c70f6ab412dce1f7cdbcc2efce92: Status 404 returned error can't find the container with id 74fa39d2648a05a8e76076823a1c54c34401c70f6ab412dce1f7cdbcc2efce92 Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.255296 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564042-6mtlx" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.263655 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.271092 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.296692 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.297674 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.297942 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b93428b-357a-4f73-ba6f-c46a4c475a98-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xktt\" (UID: \"9b93428b-357a-4f73-ba6f-c46a4c475a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.297987 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/be0a5a79-545f-4411-8cb1-d9de4e87d983-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-br5r4\" (UID: \"be0a5a79-545f-4411-8cb1-d9de4e87d983\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-br5r4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298046 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37777140-c5a3-4503-abbd-0d316a535630-serving-cert\") pod \"openshift-config-operator-7777fb866f-kgtbj\" (UID: \"37777140-c5a3-4503-abbd-0d316a535630\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298073 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkjw2\" (UniqueName: \"kubernetes.io/projected/b811256d-cc06-4ab8-a482-5ea91528ad95-kube-api-access-mkjw2\") pod \"cluster-samples-operator-665b6dd947-prmvd\" (UID: \"b811256d-cc06-4ab8-a482-5ea91528ad95\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-prmvd" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298103 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2377525-7cea-46ad-821e-834cbb9b9591-metrics-tls\") pod \"dns-operator-744455d44c-wvqhb\" (UID: \"e2377525-7cea-46ad-821e-834cbb9b9591\") " pod="openshift-dns-operator/dns-operator-744455d44c-wvqhb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298148 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfd097dc-6461-4bf5-8952-2a0ebd249f35-serving-cert\") pod \"console-operator-58897d9998-77tzj\" (UID: \"cfd097dc-6461-4bf5-8952-2a0ebd249f35\") " pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298173 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67423a72-d377-414a-b293-56f824df4d45-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2gbt5\" (UID: \"67423a72-d377-414a-b293-56f824df4d45\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298195 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/259f3fb2-856e-4885-8955-6703997715f9-signing-key\") pod \"service-ca-9c57cc56f-2ptxb\" (UID: \"259f3fb2-856e-4885-8955-6703997715f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ptxb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298218 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7de9ef01-2f54-487b-b0f6-33585560ee17-srv-cert\") pod \"olm-operator-6b444d44fb-ml5m2\" (UID: \"7de9ef01-2f54-487b-b0f6-33585560ee17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298242 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j7xj\" (UniqueName: \"kubernetes.io/projected/cc7ae5dd-fe5a-47b7-b9bf-a3e86c39be3c-kube-api-access-6j7xj\") pod \"package-server-manager-789f6589d5-rm8qz\" (UID: \"cc7ae5dd-fe5a-47b7-b9bf-a3e86c39be3c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298264 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b811256d-cc06-4ab8-a482-5ea91528ad95-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-prmvd\" (UID: \"b811256d-cc06-4ab8-a482-5ea91528ad95\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-prmvd" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298289 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/d01fc0fa-9be1-4493-be47-c420ebb5341d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-49g5j\" (UID: \"d01fc0fa-9be1-4493-be47-c420ebb5341d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298314 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx8lv\" (UniqueName: \"kubernetes.io/projected/f88a8bdd-954f-455c-aad1-03b1988afa37-kube-api-access-dx8lv\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298336 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q52c\" (UniqueName: \"kubernetes.io/projected/259f3fb2-856e-4885-8955-6703997715f9-kube-api-access-5q52c\") pod \"service-ca-9c57cc56f-2ptxb\" (UID: \"259f3fb2-856e-4885-8955-6703997715f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ptxb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298382 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f7c7f16d-c4d5-4a88-ac61-6de2735da80c-images\") pod \"machine-config-operator-74547568cd-kxcdf\" (UID: \"f7c7f16d-c4d5-4a88-ac61-6de2735da80c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298407 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31db1d26-2d05-489e-9177-580797f8897c-config-volume\") pod \"collect-profiles-29564040-5nxvw\" (UID: \"31db1d26-2d05-489e-9177-580797f8897c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298431 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a15d24cf-4182-44bb-9d60-33649137cc83-registry-certificates\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298454 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8mb\" (UniqueName: \"kubernetes.io/projected/37777140-c5a3-4503-abbd-0d316a535630-kube-api-access-cx8mb\") pod \"openshift-config-operator-7777fb866f-kgtbj\" (UID: \"37777140-c5a3-4503-abbd-0d316a535630\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298507 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/487d1c97-b703-4f1b-8c77-c23b4366a467-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4ckd5\" (UID: \"487d1c97-b703-4f1b-8c77-c23b4366a467\") " pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298532 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5783835a-afb1-41e5-b868-2fc94a44b7f6-metrics-tls\") pod \"dns-default-vxnd9\" (UID: \"5783835a-afb1-41e5-b868-2fc94a44b7f6\") " pod="openshift-dns/dns-default-vxnd9" Mar 18 14:03:35 crc kubenswrapper[4756]: E0318 14:03:35.298567 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 14:03:35.798542875 +0000 UTC m=+217.112960850 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298630 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx2ch\" (UniqueName: \"kubernetes.io/projected/6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a-kube-api-access-sx2ch\") pod \"machine-config-server-9klwh\" (UID: \"6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a\") " pod="openshift-machine-config-operator/machine-config-server-9klwh" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298679 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21001d0e-8d77-4fa1-81b4-7867be45f92a-auth-proxy-config\") pod \"machine-approver-56656f9798-npgj8\" (UID: \"21001d0e-8d77-4fa1-81b4-7867be45f92a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298711 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f88a8bdd-954f-455c-aad1-03b1988afa37-console-oauth-config\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298739 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-oauth-serving-cert\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298764 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/259f3fb2-856e-4885-8955-6703997715f9-signing-cabundle\") pod \"service-ca-9c57cc56f-2ptxb\" (UID: \"259f3fb2-856e-4885-8955-6703997715f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ptxb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298791 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f88a8bdd-954f-455c-aad1-03b1988afa37-console-serving-cert\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298818 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/66923387-ae05-40a5-9a8b-8de577e30cb1-registration-dir\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298867 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298945 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/31db1d26-2d05-489e-9177-580797f8897c-secret-volume\") pod \"collect-profiles-29564040-5nxvw\" (UID: \"31db1d26-2d05-489e-9177-580797f8897c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.298973 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a15d24cf-4182-44bb-9d60-33649137cc83-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299000 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd097dc-6461-4bf5-8952-2a0ebd249f35-config\") pod \"console-operator-58897d9998-77tzj\" (UID: \"cfd097dc-6461-4bf5-8952-2a0ebd249f35\") " pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299025 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a57b45d3-c455-4b6c-a56e-6884e65670a0-srv-cert\") pod \"catalog-operator-68c6474976-hp2sc\" (UID: \"a57b45d3-c455-4b6c-a56e-6884e65670a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299050 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a-node-bootstrap-token\") pod \"machine-config-server-9klwh\" (UID: \"6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a\") " 
pod="openshift-machine-config-operator/machine-config-server-9klwh" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299111 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrh96\" (UniqueName: \"kubernetes.io/projected/32ba09f4-59d7-469c-a882-5564e653e868-kube-api-access-vrh96\") pod \"downloads-7954f5f757-dtbtw\" (UID: \"32ba09f4-59d7-469c-a882-5564e653e868\") " pod="openshift-console/downloads-7954f5f757-dtbtw" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299185 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299213 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjgd5\" (UniqueName: \"kubernetes.io/projected/487d1c97-b703-4f1b-8c77-c23b4366a467-kube-api-access-zjgd5\") pod \"marketplace-operator-79b997595-4ckd5\" (UID: \"487d1c97-b703-4f1b-8c77-c23b4366a467\") " pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299295 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86tcb\" (UniqueName: \"kubernetes.io/projected/9b93428b-357a-4f73-ba6f-c46a4c475a98-kube-api-access-86tcb\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xktt\" (UID: \"9b93428b-357a-4f73-ba6f-c46a4c475a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299337 4756 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3a78872-6429-4e96-af97-024c68c537d7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9s7gz\" (UID: \"d3a78872-6429-4e96-af97-024c68c537d7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299385 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np5tk\" (UniqueName: \"kubernetes.io/projected/d3f5ac66-56e5-4477-ac1b-1ef496242243-kube-api-access-np5tk\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299411 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67ea2f51-e62b-497a-98dc-0835647d0d5f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-54jrv\" (UID: \"67ea2f51-e62b-497a-98dc-0835647d0d5f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299437 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299461 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf9xd\" (UniqueName: \"kubernetes.io/projected/a57b45d3-c455-4b6c-a56e-6884e65670a0-kube-api-access-jf9xd\") pod \"catalog-operator-68c6474976-hp2sc\" (UID: 
\"a57b45d3-c455-4b6c-a56e-6884e65670a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299490 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/66923387-ae05-40a5-9a8b-8de577e30cb1-csi-data-dir\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299490 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b93428b-357a-4f73-ba6f-c46a4c475a98-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xktt\" (UID: \"9b93428b-357a-4f73-ba6f-c46a4c475a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299517 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3a78872-6429-4e96-af97-024c68c537d7-config\") pod \"authentication-operator-69f744f599-9s7gz\" (UID: \"d3a78872-6429-4e96-af97-024c68c537d7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299544 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299567 4756 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d01fc0fa-9be1-4493-be47-c420ebb5341d-metrics-tls\") pod \"ingress-operator-5b745b69d9-49g5j\" (UID: \"d01fc0fa-9be1-4493-be47-c420ebb5341d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299611 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a15d24cf-4182-44bb-9d60-33649137cc83-trusted-ca\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299638 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3f5ac66-56e5-4477-ac1b-1ef496242243-audit-dir\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299662 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf7xv\" (UniqueName: \"kubernetes.io/projected/be0a5a79-545f-4411-8cb1-d9de4e87d983-kube-api-access-qf7xv\") pod \"control-plane-machine-set-operator-78cbb6b69f-br5r4\" (UID: \"be0a5a79-545f-4411-8cb1-d9de4e87d983\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-br5r4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299699 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7c7f16d-c4d5-4a88-ac61-6de2735da80c-proxy-tls\") pod \"machine-config-operator-74547568cd-kxcdf\" (UID: \"f7c7f16d-c4d5-4a88-ac61-6de2735da80c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" Mar 18 
14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299720 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67ea2f51-e62b-497a-98dc-0835647d0d5f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-54jrv\" (UID: \"67ea2f51-e62b-497a-98dc-0835647d0d5f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299742 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgsqh\" (UniqueName: \"kubernetes.io/projected/ff89c47d-35b0-46b1-ad37-cbc600a377a1-kube-api-access-mgsqh\") pod \"machine-config-controller-84d6567774-fkf55\" (UID: \"ff89c47d-35b0-46b1-ad37-cbc600a377a1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299767 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d01fc0fa-9be1-4493-be47-c420ebb5341d-trusted-ca\") pod \"ingress-operator-5b745b69d9-49g5j\" (UID: \"d01fc0fa-9be1-4493-be47-c420ebb5341d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299791 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21001d0e-8d77-4fa1-81b4-7867be45f92a-config\") pod \"machine-approver-56656f9798-npgj8\" (UID: \"21001d0e-8d77-4fa1-81b4-7867be45f92a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299814 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/ff89c47d-35b0-46b1-ad37-cbc600a377a1-proxy-tls\") pod \"machine-config-controller-84d6567774-fkf55\" (UID: \"ff89c47d-35b0-46b1-ad37-cbc600a377a1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299835 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ff89c47d-35b0-46b1-ad37-cbc600a377a1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fkf55\" (UID: \"ff89c47d-35b0-46b1-ad37-cbc600a377a1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299875 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299908 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a57b45d3-c455-4b6c-a56e-6884e65670a0-profile-collector-cert\") pod \"catalog-operator-68c6474976-hp2sc\" (UID: \"a57b45d3-c455-4b6c-a56e-6884e65670a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299951 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2fck\" (UniqueName: \"kubernetes.io/projected/d01fc0fa-9be1-4493-be47-c420ebb5341d-kube-api-access-f2fck\") pod \"ingress-operator-5b745b69d9-49g5j\" (UID: \"d01fc0fa-9be1-4493-be47-c420ebb5341d\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.299979 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5qbk\" (UniqueName: \"kubernetes.io/projected/cfd097dc-6461-4bf5-8952-2a0ebd249f35-kube-api-access-z5qbk\") pod \"console-operator-58897d9998-77tzj\" (UID: \"cfd097dc-6461-4bf5-8952-2a0ebd249f35\") " pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300014 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300040 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8jc2\" (UniqueName: \"kubernetes.io/projected/21001d0e-8d77-4fa1-81b4-7867be45f92a-kube-api-access-m8jc2\") pod \"machine-approver-56656f9798-npgj8\" (UID: \"21001d0e-8d77-4fa1-81b4-7867be45f92a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300068 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f25798a1-191b-4db2-86f5-3d49abb6de78-cert\") pod \"ingress-canary-tc72q\" (UID: \"f25798a1-191b-4db2-86f5-3d49abb6de78\") " pod="openshift-ingress-canary/ingress-canary-tc72q" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300091 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v65kz\" (UniqueName: 
\"kubernetes.io/projected/31db1d26-2d05-489e-9177-580797f8897c-kube-api-access-v65kz\") pod \"collect-profiles-29564040-5nxvw\" (UID: \"31db1d26-2d05-489e-9177-580797f8897c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300154 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67423a72-d377-414a-b293-56f824df4d45-config\") pod \"kube-apiserver-operator-766d6c64bb-2gbt5\" (UID: \"67423a72-d377-414a-b293-56f824df4d45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300183 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq9vw\" (UniqueName: \"kubernetes.io/projected/67ea2f51-e62b-497a-98dc-0835647d0d5f-kube-api-access-xq9vw\") pod \"openshift-controller-manager-operator-756b6f6bc6-54jrv\" (UID: \"67ea2f51-e62b-497a-98dc-0835647d0d5f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300209 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67423a72-d377-414a-b293-56f824df4d45-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2gbt5\" (UID: \"67423a72-d377-414a-b293-56f824df4d45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300232 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a-certs\") pod \"machine-config-server-9klwh\" (UID: \"6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a\") " 
pod="openshift-machine-config-operator/machine-config-server-9klwh" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300293 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300324 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt5gd\" (UniqueName: \"kubernetes.io/projected/5783835a-afb1-41e5-b868-2fc94a44b7f6-kube-api-access-gt5gd\") pod \"dns-default-vxnd9\" (UID: \"5783835a-afb1-41e5-b868-2fc94a44b7f6\") " pod="openshift-dns/dns-default-vxnd9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300367 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300397 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300420 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9b93428b-357a-4f73-ba6f-c46a4c475a98-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xktt\" (UID: \"9b93428b-357a-4f73-ba6f-c46a4c475a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300443 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/66923387-ae05-40a5-9a8b-8de577e30cb1-socket-dir\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300470 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/66923387-ae05-40a5-9a8b-8de577e30cb1-mountpoint-dir\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300492 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86szm\" (UniqueName: \"kubernetes.io/projected/7de9ef01-2f54-487b-b0f6-33585560ee17-kube-api-access-86szm\") pod \"olm-operator-6b444d44fb-ml5m2\" (UID: \"7de9ef01-2f54-487b-b0f6-33585560ee17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300548 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3a78872-6429-4e96-af97-024c68c537d7-serving-cert\") pod \"authentication-operator-69f744f599-9s7gz\" (UID: \"d3a78872-6429-4e96-af97-024c68c537d7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 
18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300574 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300599 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjcq5\" (UniqueName: \"kubernetes.io/projected/e57e76f1-bc32-4531-bdad-9c2f929c13f1-kube-api-access-bjcq5\") pod \"multus-admission-controller-857f4d67dd-mm2qg\" (UID: \"e57e76f1-bc32-4531-bdad-9c2f929c13f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mm2qg" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300621 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/21001d0e-8d77-4fa1-81b4-7867be45f92a-machine-approver-tls\") pod \"machine-approver-56656f9798-npgj8\" (UID: \"21001d0e-8d77-4fa1-81b4-7867be45f92a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300652 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f7c7f16d-c4d5-4a88-ac61-6de2735da80c-images\") pod \"machine-config-operator-74547568cd-kxcdf\" (UID: \"f7c7f16d-c4d5-4a88-ac61-6de2735da80c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300668 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7m59\" (UniqueName: 
\"kubernetes.io/projected/f7c7f16d-c4d5-4a88-ac61-6de2735da80c-kube-api-access-b7m59\") pod \"machine-config-operator-74547568cd-kxcdf\" (UID: \"f7c7f16d-c4d5-4a88-ac61-6de2735da80c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300734 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a15d24cf-4182-44bb-9d60-33649137cc83-registry-certificates\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300778 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr2fb\" (UniqueName: \"kubernetes.io/projected/66923387-ae05-40a5-9a8b-8de577e30cb1-kube-api-access-vr2fb\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300844 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-console-config\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300870 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-service-ca\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300888 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5783835a-afb1-41e5-b868-2fc94a44b7f6-config-volume\") pod \"dns-default-vxnd9\" (UID: \"5783835a-afb1-41e5-b868-2fc94a44b7f6\") " pod="openshift-dns/dns-default-vxnd9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300940 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-audit-policies\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300968 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3a78872-6429-4e96-af97-024c68c537d7-service-ca-bundle\") pod \"authentication-operator-69f744f599-9s7gz\" (UID: \"d3a78872-6429-4e96-af97-024c68c537d7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.300984 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301002 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/37777140-c5a3-4503-abbd-0d316a535630-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kgtbj\" (UID: \"37777140-c5a3-4503-abbd-0d316a535630\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301020 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e57e76f1-bc32-4531-bdad-9c2f929c13f1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mm2qg\" (UID: \"e57e76f1-bc32-4531-bdad-9c2f929c13f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mm2qg" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301050 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7ae5dd-fe5a-47b7-b9bf-a3e86c39be3c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rm8qz\" (UID: \"cc7ae5dd-fe5a-47b7-b9bf-a3e86c39be3c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301069 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ht8w\" (UniqueName: \"kubernetes.io/projected/d3a78872-6429-4e96-af97-024c68c537d7-kube-api-access-8ht8w\") pod \"authentication-operator-69f744f599-9s7gz\" (UID: \"d3a78872-6429-4e96-af97-024c68c537d7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301085 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/487d1c97-b703-4f1b-8c77-c23b4366a467-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4ckd5\" (UID: \"487d1c97-b703-4f1b-8c77-c23b4366a467\") " pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301104 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f7c7f16d-c4d5-4a88-ac61-6de2735da80c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kxcdf\" (UID: \"f7c7f16d-c4d5-4a88-ac61-6de2735da80c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301138 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301159 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twthr\" (UniqueName: \"kubernetes.io/projected/e2377525-7cea-46ad-821e-834cbb9b9591-kube-api-access-twthr\") pod \"dns-operator-744455d44c-wvqhb\" (UID: \"e2377525-7cea-46ad-821e-834cbb9b9591\") " pod="openshift-dns-operator/dns-operator-744455d44c-wvqhb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301178 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a15d24cf-4182-44bb-9d60-33649137cc83-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301203 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-trusted-ca-bundle\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " 
pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301220 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-registry-tls\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301235 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/66923387-ae05-40a5-9a8b-8de577e30cb1-plugins-dir\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301253 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7nl8\" (UniqueName: \"kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-kube-api-access-w7nl8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301271 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfd097dc-6461-4bf5-8952-2a0ebd249f35-trusted-ca\") pod \"console-operator-58897d9998-77tzj\" (UID: \"cfd097dc-6461-4bf5-8952-2a0ebd249f35\") " pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301313 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-bound-sa-token\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: 
\"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301334 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7de9ef01-2f54-487b-b0f6-33585560ee17-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ml5m2\" (UID: \"7de9ef01-2f54-487b-b0f6-33585560ee17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301354 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xrsm\" (UniqueName: \"kubernetes.io/projected/f25798a1-191b-4db2-86f5-3d49abb6de78-kube-api-access-4xrsm\") pod \"ingress-canary-tc72q\" (UID: \"f25798a1-191b-4db2-86f5-3d49abb6de78\") " pod="openshift-ingress-canary/ingress-canary-tc72q" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.301398 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhxmv\" (UniqueName: \"kubernetes.io/projected/e79e3bba-d970-4daf-983d-aa4894129506-kube-api-access-fhxmv\") pod \"migrator-59844c95c7-qd587\" (UID: \"e79e3bba-d970-4daf-983d-aa4894129506\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qd587" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.304154 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3f5ac66-56e5-4477-ac1b-1ef496242243-audit-dir\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.304932 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/487d1c97-b703-4f1b-8c77-c23b4366a467-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4ckd5\" (UID: \"487d1c97-b703-4f1b-8c77-c23b4366a467\") " pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.307058 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-console-config\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.307429 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67423a72-d377-414a-b293-56f824df4d45-config\") pod \"kube-apiserver-operator-766d6c64bb-2gbt5\" (UID: \"67423a72-d377-414a-b293-56f824df4d45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.307738 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a15d24cf-4182-44bb-9d60-33649137cc83-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.308769 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-trusted-ca-bundle\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.309035 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/d3a78872-6429-4e96-af97-024c68c537d7-service-ca-bundle\") pod \"authentication-operator-69f744f599-9s7gz\" (UID: \"d3a78872-6429-4e96-af97-024c68c537d7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.309864 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f88a8bdd-954f-455c-aad1-03b1988afa37-console-oauth-config\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.309876 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67ea2f51-e62b-497a-98dc-0835647d0d5f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-54jrv\" (UID: \"67ea2f51-e62b-497a-98dc-0835647d0d5f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.310377 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f7c7f16d-c4d5-4a88-ac61-6de2735da80c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kxcdf\" (UID: \"f7c7f16d-c4d5-4a88-ac61-6de2735da80c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.310384 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d01fc0fa-9be1-4493-be47-c420ebb5341d-trusted-ca\") pod \"ingress-operator-5b745b69d9-49g5j\" (UID: \"d01fc0fa-9be1-4493-be47-c420ebb5341d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 
14:03:35.310935 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37777140-c5a3-4503-abbd-0d316a535630-serving-cert\") pod \"openshift-config-operator-7777fb866f-kgtbj\" (UID: \"37777140-c5a3-4503-abbd-0d316a535630\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.311837 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.312140 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/259f3fb2-856e-4885-8955-6703997715f9-signing-cabundle\") pod \"service-ca-9c57cc56f-2ptxb\" (UID: \"259f3fb2-856e-4885-8955-6703997715f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ptxb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.312271 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3a78872-6429-4e96-af97-024c68c537d7-serving-cert\") pod \"authentication-operator-69f744f599-9s7gz\" (UID: \"d3a78872-6429-4e96-af97-024c68c537d7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.312715 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-oauth-serving-cert\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " 
pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.312748 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.312791 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-service-ca\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.313253 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd097dc-6461-4bf5-8952-2a0ebd249f35-config\") pod \"console-operator-58897d9998-77tzj\" (UID: \"cfd097dc-6461-4bf5-8952-2a0ebd249f35\") " pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.313710 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a57b45d3-c455-4b6c-a56e-6884e65670a0-profile-collector-cert\") pod \"catalog-operator-68c6474976-hp2sc\" (UID: \"a57b45d3-c455-4b6c-a56e-6884e65670a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.313806 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-audit-policies\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: 
\"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.314011 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.314078 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3a78872-6429-4e96-af97-024c68c537d7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9s7gz\" (UID: \"d3a78872-6429-4e96-af97-024c68c537d7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.314245 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.314328 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/37777140-c5a3-4503-abbd-0d316a535630-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kgtbj\" (UID: \"37777140-c5a3-4503-abbd-0d316a535630\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.314826 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.318207 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a15d24cf-4182-44bb-9d60-33649137cc83-trusted-ca\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.319629 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfd097dc-6461-4bf5-8952-2a0ebd249f35-trusted-ca\") pod \"console-operator-58897d9998-77tzj\" (UID: \"cfd097dc-6461-4bf5-8952-2a0ebd249f35\") " pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:35 crc kubenswrapper[4756]: E0318 14:03:35.320436 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:35.820421142 +0000 UTC m=+217.134839118 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.329282 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3a78872-6429-4e96-af97-024c68c537d7-config\") pod \"authentication-operator-69f744f599-9s7gz\" (UID: \"d3a78872-6429-4e96-af97-024c68c537d7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.330360 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a57b45d3-c455-4b6c-a56e-6884e65670a0-srv-cert\") pod \"catalog-operator-68c6474976-hp2sc\" (UID: \"a57b45d3-c455-4b6c-a56e-6884e65670a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.331288 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" event={"ID":"2faea523-d2d3-46fd-b623-fe2cc8928c8d","Type":"ContainerStarted","Data":"74fa39d2648a05a8e76076823a1c54c34401c70f6ab412dce1f7cdbcc2efce92"} Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.331314 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" event={"ID":"3985e570-6d23-4928-a018-40e9b5868b89","Type":"ContainerStarted","Data":"eccce61de3eb34244a7a41551c52ffb01f8e6a663546192ac5992fa6ca2037ad"} Mar 18 14:03:35 crc kubenswrapper[4756]: 
I0318 14:03:35.331323 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f9hzx" event={"ID":"d199014d-5b92-48d1-966d-30af0da2e1c2","Type":"ContainerStarted","Data":"951a26b22ad3fa6d57a642cf63f6369dadddb18231a44df462b6ba33fe257ad3"} Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.334219 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a15d24cf-4182-44bb-9d60-33649137cc83-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.335183 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.335218 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.335861 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7c7f16d-c4d5-4a88-ac61-6de2735da80c-proxy-tls\") pod \"machine-config-operator-74547568cd-kxcdf\" (UID: \"f7c7f16d-c4d5-4a88-ac61-6de2735da80c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.338399 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b811256d-cc06-4ab8-a482-5ea91528ad95-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-prmvd\" (UID: \"b811256d-cc06-4ab8-a482-5ea91528ad95\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-prmvd" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.339423 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/487d1c97-b703-4f1b-8c77-c23b4366a467-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4ckd5\" (UID: \"487d1c97-b703-4f1b-8c77-c23b4366a467\") " pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.347200 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-registry-tls\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.350153 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/67423a72-d377-414a-b293-56f824df4d45-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2gbt5\" (UID: \"67423a72-d377-414a-b293-56f824df4d45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.350207 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67ea2f51-e62b-497a-98dc-0835647d0d5f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-54jrv\" (UID: \"67ea2f51-e62b-497a-98dc-0835647d0d5f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.350501 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/259f3fb2-856e-4885-8955-6703997715f9-signing-key\") pod \"service-ca-9c57cc56f-2ptxb\" (UID: \"259f3fb2-856e-4885-8955-6703997715f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ptxb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.353627 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b93428b-357a-4f73-ba6f-c46a4c475a98-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xktt\" (UID: \"9b93428b-357a-4f73-ba6f-c46a4c475a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.353694 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7ae5dd-fe5a-47b7-b9bf-a3e86c39be3c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rm8qz\" (UID: \"cc7ae5dd-fe5a-47b7-b9bf-a3e86c39be3c\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.354487 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.354562 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.354600 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfd097dc-6461-4bf5-8952-2a0ebd249f35-serving-cert\") pod \"console-operator-58897d9998-77tzj\" (UID: \"cfd097dc-6461-4bf5-8952-2a0ebd249f35\") " pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.355191 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f88a8bdd-954f-455c-aad1-03b1988afa37-console-serving-cert\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.359615 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/d01fc0fa-9be1-4493-be47-c420ebb5341d-metrics-tls\") pod \"ingress-operator-5b745b69d9-49g5j\" (UID: \"d01fc0fa-9be1-4493-be47-c420ebb5341d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.360945 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.363805 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.364302 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.367172 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5"] Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.368453 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-95vm5"] Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.368864 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkjw2\" (UniqueName: \"kubernetes.io/projected/b811256d-cc06-4ab8-a482-5ea91528ad95-kube-api-access-mkjw2\") pod \"cluster-samples-operator-665b6dd947-prmvd\" (UID: \"b811256d-cc06-4ab8-a482-5ea91528ad95\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-prmvd" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.369393 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx8lv\" (UniqueName: \"kubernetes.io/projected/f88a8bdd-954f-455c-aad1-03b1988afa37-kube-api-access-dx8lv\") pod \"console-f9d7485db-k5xg9\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.383667 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pvzfk"] Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.387057 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q52c\" (UniqueName: \"kubernetes.io/projected/259f3fb2-856e-4885-8955-6703997715f9-kube-api-access-5q52c\") pod \"service-ca-9c57cc56f-2ptxb\" (UID: \"259f3fb2-856e-4885-8955-6703997715f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-2ptxb" Mar 18 14:03:35 crc kubenswrapper[4756]: W0318 14:03:35.389995 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42d7ef3d_2c3b_455c_9457_44441f1bfcff.slice/crio-599d763dad17aa921a7c1e59f7a1aafcc5bb0c99fdc753fee9999114330b93dd WatchSource:0}: Error finding container 599d763dad17aa921a7c1e59f7a1aafcc5bb0c99fdc753fee9999114330b93dd: Status 404 returned error can't find the container with id 599d763dad17aa921a7c1e59f7a1aafcc5bb0c99fdc753fee9999114330b93dd Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402179 4756 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402346 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a-node-bootstrap-token\") pod \"machine-config-server-9klwh\" (UID: \"6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a\") " pod="openshift-machine-config-operator/machine-config-server-9klwh" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402376 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrh96\" (UniqueName: \"kubernetes.io/projected/32ba09f4-59d7-469c-a882-5564e653e868-kube-api-access-vrh96\") pod \"downloads-7954f5f757-dtbtw\" (UID: \"32ba09f4-59d7-469c-a882-5564e653e868\") " pod="openshift-console/downloads-7954f5f757-dtbtw" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402423 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/66923387-ae05-40a5-9a8b-8de577e30cb1-csi-data-dir\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402442 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf7xv\" (UniqueName: \"kubernetes.io/projected/be0a5a79-545f-4411-8cb1-d9de4e87d983-kube-api-access-qf7xv\") pod \"control-plane-machine-set-operator-78cbb6b69f-br5r4\" (UID: \"be0a5a79-545f-4411-8cb1-d9de4e87d983\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-br5r4" Mar 18 14:03:35 crc 
kubenswrapper[4756]: I0318 14:03:35.402470 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgsqh\" (UniqueName: \"kubernetes.io/projected/ff89c47d-35b0-46b1-ad37-cbc600a377a1-kube-api-access-mgsqh\") pod \"machine-config-controller-84d6567774-fkf55\" (UID: \"ff89c47d-35b0-46b1-ad37-cbc600a377a1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402486 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21001d0e-8d77-4fa1-81b4-7867be45f92a-config\") pod \"machine-approver-56656f9798-npgj8\" (UID: \"21001d0e-8d77-4fa1-81b4-7867be45f92a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402502 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff89c47d-35b0-46b1-ad37-cbc600a377a1-proxy-tls\") pod \"machine-config-controller-84d6567774-fkf55\" (UID: \"ff89c47d-35b0-46b1-ad37-cbc600a377a1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402526 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ff89c47d-35b0-46b1-ad37-cbc600a377a1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fkf55\" (UID: \"ff89c47d-35b0-46b1-ad37-cbc600a377a1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402556 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8jc2\" (UniqueName: \"kubernetes.io/projected/21001d0e-8d77-4fa1-81b4-7867be45f92a-kube-api-access-m8jc2\") 
pod \"machine-approver-56656f9798-npgj8\" (UID: \"21001d0e-8d77-4fa1-81b4-7867be45f92a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402572 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f25798a1-191b-4db2-86f5-3d49abb6de78-cert\") pod \"ingress-canary-tc72q\" (UID: \"f25798a1-191b-4db2-86f5-3d49abb6de78\") " pod="openshift-ingress-canary/ingress-canary-tc72q" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402586 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v65kz\" (UniqueName: \"kubernetes.io/projected/31db1d26-2d05-489e-9177-580797f8897c-kube-api-access-v65kz\") pod \"collect-profiles-29564040-5nxvw\" (UID: \"31db1d26-2d05-489e-9177-580797f8897c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402618 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a-certs\") pod \"machine-config-server-9klwh\" (UID: \"6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a\") " pod="openshift-machine-config-operator/machine-config-server-9klwh" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402634 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt5gd\" (UniqueName: \"kubernetes.io/projected/5783835a-afb1-41e5-b868-2fc94a44b7f6-kube-api-access-gt5gd\") pod \"dns-default-vxnd9\" (UID: \"5783835a-afb1-41e5-b868-2fc94a44b7f6\") " pod="openshift-dns/dns-default-vxnd9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402648 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/66923387-ae05-40a5-9a8b-8de577e30cb1-socket-dir\") pod 
\"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402665 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/66923387-ae05-40a5-9a8b-8de577e30cb1-mountpoint-dir\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402681 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86szm\" (UniqueName: \"kubernetes.io/projected/7de9ef01-2f54-487b-b0f6-33585560ee17-kube-api-access-86szm\") pod \"olm-operator-6b444d44fb-ml5m2\" (UID: \"7de9ef01-2f54-487b-b0f6-33585560ee17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402698 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjcq5\" (UniqueName: \"kubernetes.io/projected/e57e76f1-bc32-4531-bdad-9c2f929c13f1-kube-api-access-bjcq5\") pod \"multus-admission-controller-857f4d67dd-mm2qg\" (UID: \"e57e76f1-bc32-4531-bdad-9c2f929c13f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mm2qg" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402714 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/21001d0e-8d77-4fa1-81b4-7867be45f92a-machine-approver-tls\") pod \"machine-approver-56656f9798-npgj8\" (UID: \"21001d0e-8d77-4fa1-81b4-7867be45f92a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402748 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr2fb\" 
(UniqueName: \"kubernetes.io/projected/66923387-ae05-40a5-9a8b-8de577e30cb1-kube-api-access-vr2fb\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402763 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5783835a-afb1-41e5-b868-2fc94a44b7f6-config-volume\") pod \"dns-default-vxnd9\" (UID: \"5783835a-afb1-41e5-b868-2fc94a44b7f6\") " pod="openshift-dns/dns-default-vxnd9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402781 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e57e76f1-bc32-4531-bdad-9c2f929c13f1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mm2qg\" (UID: \"e57e76f1-bc32-4531-bdad-9c2f929c13f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mm2qg" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402811 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twthr\" (UniqueName: \"kubernetes.io/projected/e2377525-7cea-46ad-821e-834cbb9b9591-kube-api-access-twthr\") pod \"dns-operator-744455d44c-wvqhb\" (UID: \"e2377525-7cea-46ad-821e-834cbb9b9591\") " pod="openshift-dns-operator/dns-operator-744455d44c-wvqhb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402828 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/66923387-ae05-40a5-9a8b-8de577e30cb1-plugins-dir\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.402855 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7de9ef01-2f54-487b-b0f6-33585560ee17-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ml5m2\" (UID: \"7de9ef01-2f54-487b-b0f6-33585560ee17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" Mar 18 14:03:35 crc kubenswrapper[4756]: E0318 14:03:35.403406 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:35.903390349 +0000 UTC m=+217.217808324 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.404234 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xrsm\" (UniqueName: \"kubernetes.io/projected/f25798a1-191b-4db2-86f5-3d49abb6de78-kube-api-access-4xrsm\") pod \"ingress-canary-tc72q\" (UID: \"f25798a1-191b-4db2-86f5-3d49abb6de78\") " pod="openshift-ingress-canary/ingress-canary-tc72q" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.404274 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/be0a5a79-545f-4411-8cb1-d9de4e87d983-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-br5r4\" (UID: \"be0a5a79-545f-4411-8cb1-d9de4e87d983\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-br5r4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.404299 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2377525-7cea-46ad-821e-834cbb9b9591-metrics-tls\") pod \"dns-operator-744455d44c-wvqhb\" (UID: \"e2377525-7cea-46ad-821e-834cbb9b9591\") " pod="openshift-dns-operator/dns-operator-744455d44c-wvqhb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.404314 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7de9ef01-2f54-487b-b0f6-33585560ee17-srv-cert\") pod \"olm-operator-6b444d44fb-ml5m2\" (UID: \"7de9ef01-2f54-487b-b0f6-33585560ee17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.404343 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31db1d26-2d05-489e-9177-580797f8897c-config-volume\") pod \"collect-profiles-29564040-5nxvw\" (UID: \"31db1d26-2d05-489e-9177-580797f8897c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.404364 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5783835a-afb1-41e5-b868-2fc94a44b7f6-metrics-tls\") pod \"dns-default-vxnd9\" (UID: \"5783835a-afb1-41e5-b868-2fc94a44b7f6\") " pod="openshift-dns/dns-default-vxnd9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.404379 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx2ch\" (UniqueName: \"kubernetes.io/projected/6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a-kube-api-access-sx2ch\") pod \"machine-config-server-9klwh\" (UID: 
\"6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a\") " pod="openshift-machine-config-operator/machine-config-server-9klwh" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.404401 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21001d0e-8d77-4fa1-81b4-7867be45f92a-auth-proxy-config\") pod \"machine-approver-56656f9798-npgj8\" (UID: \"21001d0e-8d77-4fa1-81b4-7867be45f92a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.404426 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/66923387-ae05-40a5-9a8b-8de577e30cb1-registration-dir\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.404445 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/31db1d26-2d05-489e-9177-580797f8897c-secret-volume\") pod \"collect-profiles-29564040-5nxvw\" (UID: \"31db1d26-2d05-489e-9177-580797f8897c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.404675 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21001d0e-8d77-4fa1-81b4-7867be45f92a-config\") pod \"machine-approver-56656f9798-npgj8\" (UID: \"21001d0e-8d77-4fa1-81b4-7867be45f92a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.404841 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/66923387-ae05-40a5-9a8b-8de577e30cb1-csi-data-dir\") 
pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.405080 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/66923387-ae05-40a5-9a8b-8de577e30cb1-socket-dir\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.405502 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/66923387-ae05-40a5-9a8b-8de577e30cb1-mountpoint-dir\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.406711 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31db1d26-2d05-489e-9177-580797f8897c-config-volume\") pod \"collect-profiles-29564040-5nxvw\" (UID: \"31db1d26-2d05-489e-9177-580797f8897c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.406834 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ff89c47d-35b0-46b1-ad37-cbc600a377a1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fkf55\" (UID: \"ff89c47d-35b0-46b1-ad37-cbc600a377a1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.408039 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/21001d0e-8d77-4fa1-81b4-7867be45f92a-auth-proxy-config\") pod \"machine-approver-56656f9798-npgj8\" (UID: \"21001d0e-8d77-4fa1-81b4-7867be45f92a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.408270 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/66923387-ae05-40a5-9a8b-8de577e30cb1-registration-dir\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.408320 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/66923387-ae05-40a5-9a8b-8de577e30cb1-plugins-dir\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.408846 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7m59\" (UniqueName: \"kubernetes.io/projected/f7c7f16d-c4d5-4a88-ac61-6de2735da80c-kube-api-access-b7m59\") pod \"machine-config-operator-74547568cd-kxcdf\" (UID: \"f7c7f16d-c4d5-4a88-ac61-6de2735da80c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.410055 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5783835a-afb1-41e5-b868-2fc94a44b7f6-config-volume\") pod \"dns-default-vxnd9\" (UID: \"5783835a-afb1-41e5-b868-2fc94a44b7f6\") " pod="openshift-dns/dns-default-vxnd9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.416484 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/31db1d26-2d05-489e-9177-580797f8897c-secret-volume\") pod \"collect-profiles-29564040-5nxvw\" (UID: \"31db1d26-2d05-489e-9177-580797f8897c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.420633 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l"] Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.423468 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.428227 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7de9ef01-2f54-487b-b0f6-33585560ee17-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ml5m2\" (UID: \"7de9ef01-2f54-487b-b0f6-33585560ee17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.429183 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2377525-7cea-46ad-821e-834cbb9b9591-metrics-tls\") pod \"dns-operator-744455d44c-wvqhb\" (UID: \"e2377525-7cea-46ad-821e-834cbb9b9591\") " pod="openshift-dns-operator/dns-operator-744455d44c-wvqhb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.429348 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f25798a1-191b-4db2-86f5-3d49abb6de78-cert\") pod \"ingress-canary-tc72q\" (UID: \"f25798a1-191b-4db2-86f5-3d49abb6de78\") " pod="openshift-ingress-canary/ingress-canary-tc72q" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.435767 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/ff89c47d-35b0-46b1-ad37-cbc600a377a1-proxy-tls\") pod \"machine-config-controller-84d6567774-fkf55\" (UID: \"ff89c47d-35b0-46b1-ad37-cbc600a377a1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.438051 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c"] Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.438856 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a-node-bootstrap-token\") pod \"machine-config-server-9klwh\" (UID: \"6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a\") " pod="openshift-machine-config-operator/machine-config-server-9klwh" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.438976 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a-certs\") pod \"machine-config-server-9klwh\" (UID: \"6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a\") " pod="openshift-machine-config-operator/machine-config-server-9klwh" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.439405 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj"] Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.439993 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/21001d0e-8d77-4fa1-81b4-7867be45f92a-machine-approver-tls\") pod \"machine-approver-56656f9798-npgj8\" (UID: \"21001d0e-8d77-4fa1-81b4-7867be45f92a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.441094 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5783835a-afb1-41e5-b868-2fc94a44b7f6-metrics-tls\") pod \"dns-default-vxnd9\" (UID: \"5783835a-afb1-41e5-b868-2fc94a44b7f6\") " pod="openshift-dns/dns-default-vxnd9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.443818 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e57e76f1-bc32-4531-bdad-9c2f929c13f1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mm2qg\" (UID: \"e57e76f1-bc32-4531-bdad-9c2f929c13f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mm2qg" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.445785 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7de9ef01-2f54-487b-b0f6-33585560ee17-srv-cert\") pod \"olm-operator-6b444d44fb-ml5m2\" (UID: \"7de9ef01-2f54-487b-b0f6-33585560ee17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.446374 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjgd5\" (UniqueName: \"kubernetes.io/projected/487d1c97-b703-4f1b-8c77-c23b4366a467-kube-api-access-zjgd5\") pod \"marketplace-operator-79b997595-4ckd5\" (UID: \"487d1c97-b703-4f1b-8c77-c23b4366a467\") " pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.446403 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/be0a5a79-545f-4411-8cb1-d9de4e87d983-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-br5r4\" (UID: \"be0a5a79-545f-4411-8cb1-d9de4e87d983\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-br5r4" Mar 18 14:03:35 crc 
kubenswrapper[4756]: I0318 14:03:35.448481 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx8mb\" (UniqueName: \"kubernetes.io/projected/37777140-c5a3-4503-abbd-0d316a535630-kube-api-access-cx8mb\") pod \"openshift-config-operator-7777fb866f-kgtbj\" (UID: \"37777140-c5a3-4503-abbd-0d316a535630\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.462141 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j7xj\" (UniqueName: \"kubernetes.io/projected/cc7ae5dd-fe5a-47b7-b9bf-a3e86c39be3c-kube-api-access-6j7xj\") pod \"package-server-manager-789f6589d5-rm8qz\" (UID: \"cc7ae5dd-fe5a-47b7-b9bf-a3e86c39be3c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz" Mar 18 14:03:35 crc kubenswrapper[4756]: W0318 14:03:35.467994 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7298879_219c_4329_a9e6_1854fc22b1d9.slice/crio-f0796f574574505b6495ce6c5686fea658ee8038faa65ecaca7e741555c32476 WatchSource:0}: Error finding container f0796f574574505b6495ce6c5686fea658ee8038faa65ecaca7e741555c32476: Status 404 returned error can't find the container with id f0796f574574505b6495ce6c5686fea658ee8038faa65ecaca7e741555c32476 Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.483828 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2fck\" (UniqueName: \"kubernetes.io/projected/d01fc0fa-9be1-4493-be47-c420ebb5341d-kube-api-access-f2fck\") pod \"ingress-operator-5b745b69d9-49g5j\" (UID: \"d01fc0fa-9be1-4493-be47-c420ebb5341d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.502714 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5qbk\" (UniqueName: 
\"kubernetes.io/projected/cfd097dc-6461-4bf5-8952-2a0ebd249f35-kube-api-access-z5qbk\") pod \"console-operator-58897d9998-77tzj\" (UID: \"cfd097dc-6461-4bf5-8952-2a0ebd249f35\") " pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.509272 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: E0318 14:03:35.509730 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:36.009717593 +0000 UTC m=+217.324135568 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.538435 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq9vw\" (UniqueName: \"kubernetes.io/projected/67ea2f51-e62b-497a-98dc-0835647d0d5f-kube-api-access-xq9vw\") pod \"openshift-controller-manager-operator-756b6f6bc6-54jrv\" (UID: \"67ea2f51-e62b-497a-98dc-0835647d0d5f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.543164 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67423a72-d377-414a-b293-56f824df4d45-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2gbt5\" (UID: \"67423a72-d377-414a-b293-56f824df4d45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.586418 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ht8w\" (UniqueName: \"kubernetes.io/projected/d3a78872-6429-4e96-af97-024c68c537d7-kube-api-access-8ht8w\") pod \"authentication-operator-69f744f599-9s7gz\" (UID: \"d3a78872-6429-4e96-af97-024c68c537d7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.603039 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86tcb\" (UniqueName: 
\"kubernetes.io/projected/9b93428b-357a-4f73-ba6f-c46a4c475a98-kube-api-access-86tcb\") pod \"kube-storage-version-migrator-operator-b67b599dd-2xktt\" (UID: \"9b93428b-357a-4f73-ba6f-c46a4c475a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.610001 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:35 crc kubenswrapper[4756]: E0318 14:03:35.610156 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:36.110132197 +0000 UTC m=+217.424550172 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.610487 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: E0318 14:03:35.610995 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:36.11097909 +0000 UTC m=+217.425397065 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.611429 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.617312 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.621205 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-prmvd" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.624388 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np5tk\" (UniqueName: \"kubernetes.io/projected/d3f5ac66-56e5-4477-ac1b-1ef496242243-kube-api-access-np5tk\") pod \"oauth-openshift-558db77b4-9sqqb\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.626881 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.640533 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-2ptxb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.642828 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.647237 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf9xd\" (UniqueName: \"kubernetes.io/projected/a57b45d3-c455-4b6c-a56e-6884e65670a0-kube-api-access-jf9xd\") pod \"catalog-operator-68c6474976-hp2sc\" (UID: \"a57b45d3-c455-4b6c-a56e-6884e65670a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.653759 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h"] Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.661365 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.665947 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7nl8\" (UniqueName: \"kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-kube-api-access-w7nl8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.680652 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.684375 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d01fc0fa-9be1-4493-be47-c420ebb5341d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-49g5j\" (UID: \"d01fc0fa-9be1-4493-be47-c420ebb5341d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" Mar 18 14:03:35 crc kubenswrapper[4756]: W0318 14:03:35.692275 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode271e583_b5d3_452e_a009_f8b21a8121d9.slice/crio-c199295e8136756badb1400dc2facdd15f911edf1010a3d8fa873e065ac66cf9 WatchSource:0}: Error finding container c199295e8136756badb1400dc2facdd15f911edf1010a3d8fa873e065ac66cf9: Status 404 returned error can't find the container with id c199295e8136756badb1400dc2facdd15f911edf1010a3d8fa873e065ac66cf9 Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.705788 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-bound-sa-token\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.706337 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.712031 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:35 crc kubenswrapper[4756]: E0318 14:03:35.712464 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:36.212448222 +0000 UTC m=+217.526866197 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.729243 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhxmv\" (UniqueName: \"kubernetes.io/projected/e79e3bba-d970-4daf-983d-aa4894129506-kube-api-access-fhxmv\") pod \"migrator-59844c95c7-qd587\" (UID: \"e79e3bba-d970-4daf-983d-aa4894129506\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qd587" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.731080 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564042-6mtlx"] Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 
14:03:35.738761 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.747208 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf"] Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.749325 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.750662 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v"] Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.761279 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.762775 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v65kz\" (UniqueName: \"kubernetes.io/projected/31db1d26-2d05-489e-9177-580797f8897c-kube-api-access-v65kz\") pod \"collect-profiles-29564040-5nxvw\" (UID: \"31db1d26-2d05-489e-9177-580797f8897c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.769290 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjcq5\" (UniqueName: \"kubernetes.io/projected/e57e76f1-bc32-4531-bdad-9c2f929c13f1-kube-api-access-bjcq5\") pod \"multus-admission-controller-857f4d67dd-mm2qg\" (UID: \"e57e76f1-bc32-4531-bdad-9c2f929c13f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mm2qg" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.785054 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vrh96\" (UniqueName: \"kubernetes.io/projected/32ba09f4-59d7-469c-a882-5564e653e868-kube-api-access-vrh96\") pod \"downloads-7954f5f757-dtbtw\" (UID: \"32ba09f4-59d7-469c-a882-5564e653e868\") " pod="openshift-console/downloads-7954f5f757-dtbtw" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.807767 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-588c8"] Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.810373 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf7xv\" (UniqueName: \"kubernetes.io/projected/be0a5a79-545f-4411-8cb1-d9de4e87d983-kube-api-access-qf7xv\") pod \"control-plane-machine-set-operator-78cbb6b69f-br5r4\" (UID: \"be0a5a79-545f-4411-8cb1-d9de4e87d983\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-br5r4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.810659 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.813999 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:35 crc kubenswrapper[4756]: E0318 14:03:35.814570 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:36.314555532 +0000 UTC m=+217.628973507 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.815003 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.852561 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xrsm\" (UniqueName: \"kubernetes.io/projected/f25798a1-191b-4db2-86f5-3d49abb6de78-kube-api-access-4xrsm\") pod \"ingress-canary-tc72q\" (UID: \"f25798a1-191b-4db2-86f5-3d49abb6de78\") " pod="openshift-ingress-canary/ingress-canary-tc72q" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.859432 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgsqh\" (UniqueName: \"kubernetes.io/projected/ff89c47d-35b0-46b1-ad37-cbc600a377a1-kube-api-access-mgsqh\") pod \"machine-config-controller-84d6567774-fkf55\" (UID: \"ff89c47d-35b0-46b1-ad37-cbc600a377a1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.865281 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm"] Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.866258 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-prmvd"] Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.871479 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m8jc2\" (UniqueName: \"kubernetes.io/projected/21001d0e-8d77-4fa1-81b4-7867be45f92a-kube-api-access-m8jc2\") pod \"machine-approver-56656f9798-npgj8\" (UID: \"21001d0e-8d77-4fa1-81b4-7867be45f92a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.889497 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj"] Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.895206 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx2ch\" (UniqueName: \"kubernetes.io/projected/6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a-kube-api-access-sx2ch\") pod \"machine-config-server-9klwh\" (UID: \"6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a\") " pod="openshift-machine-config-operator/machine-config-server-9klwh" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.898830 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.904424 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qd587" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.910355 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt5gd\" (UniqueName: \"kubernetes.io/projected/5783835a-afb1-41e5-b868-2fc94a44b7f6-kube-api-access-gt5gd\") pod \"dns-default-vxnd9\" (UID: \"5783835a-afb1-41e5-b868-2fc94a44b7f6\") " pod="openshift-dns/dns-default-vxnd9" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.922273 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:35 crc kubenswrapper[4756]: E0318 14:03:35.923011 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:36.422998132 +0000 UTC m=+217.737416107 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.929201 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86szm\" (UniqueName: \"kubernetes.io/projected/7de9ef01-2f54-487b-b0f6-33585560ee17-kube-api-access-86szm\") pod \"olm-operator-6b444d44fb-ml5m2\" (UID: \"7de9ef01-2f54-487b-b0f6-33585560ee17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.947326 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twthr\" (UniqueName: \"kubernetes.io/projected/e2377525-7cea-46ad-821e-834cbb9b9591-kube-api-access-twthr\") pod \"dns-operator-744455d44c-wvqhb\" (UID: \"e2377525-7cea-46ad-821e-834cbb9b9591\") " pod="openshift-dns-operator/dns-operator-744455d44c-wvqhb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.949918 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mm2qg" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.955879 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.962555 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wvqhb" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.963224 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr2fb\" (UniqueName: \"kubernetes.io/projected/66923387-ae05-40a5-9a8b-8de577e30cb1-kube-api-access-vr2fb\") pod \"csi-hostpathplugin-p9mcq\" (UID: \"66923387-ae05-40a5-9a8b-8de577e30cb1\") " pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.970731 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-br5r4" Mar 18 14:03:35 crc kubenswrapper[4756]: I0318 14:03:35.979521 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.007360 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dtbtw" Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.014200 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.023835 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz"] Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.027935 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:36 crc kubenswrapper[4756]: E0318 14:03:36.028446 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:36.528434532 +0000 UTC m=+217.842852507 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.028755 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55" Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.036424 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tc72q" Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.047441 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vxnd9" Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.066320 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.069333 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9klwh" Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.131133 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:36 crc kubenswrapper[4756]: E0318 14:03:36.131852 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:36.631836296 +0000 UTC m=+217.946254271 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.150330 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4ckd5"] Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.198870 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv"] Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.217673 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt"] Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.235927 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:36 crc kubenswrapper[4756]: E0318 14:03:36.236288 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:36.736274269 +0000 UTC m=+218.050692254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.291042 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-k5xg9"] Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.320587 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2ptxb"] Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.324948 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5"] Mar 18 14:03:36 crc kubenswrapper[4756]: W0318 14:03:36.330489 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf88a8bdd_954f_455c_aad1_03b1988afa37.slice/crio-22e3b1e94cef2835a0cfbc2dc20b571b8dc0198acbfab345e7e9b7ebba9d182c WatchSource:0}: Error finding container 22e3b1e94cef2835a0cfbc2dc20b571b8dc0198acbfab345e7e9b7ebba9d182c: Status 404 returned error can't find the container with id 22e3b1e94cef2835a0cfbc2dc20b571b8dc0198acbfab345e7e9b7ebba9d182c Mar 18 14:03:36 crc kubenswrapper[4756]: W0318 14:03:36.333813 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b93428b_357a_4f73_ba6f_c46a4c475a98.slice/crio-5fb8fd026ec996760dbbedd98d3cd0fe2deac551821fbaa519b93d6ca131d0ac WatchSource:0}: Error finding container 5fb8fd026ec996760dbbedd98d3cd0fe2deac551821fbaa519b93d6ca131d0ac: Status 404 returned error can't find the 
container with id 5fb8fd026ec996760dbbedd98d3cd0fe2deac551821fbaa519b93d6ca131d0ac Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.336763 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:36 crc kubenswrapper[4756]: E0318 14:03:36.336873 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:36.836853058 +0000 UTC m=+218.151271043 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.337149 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:36 crc kubenswrapper[4756]: E0318 14:03:36.337545 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:36.837531946 +0000 UTC m=+218.151949921 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.351751 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj" event={"ID":"8cad737e-466f-4c7c-b004-c4723692f479","Type":"ContainerStarted","Data":"08e8f5106f98468b447a38b5007761165a4fad3ae4b6df986c385b712f2dd340"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.351794 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj" event={"ID":"8cad737e-466f-4c7c-b004-c4723692f479","Type":"ContainerStarted","Data":"0c3f1d58baace362ad76728d5221f115b20c104872cdf3d1630192958241d8e2"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.356769 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv" event={"ID":"67ea2f51-e62b-497a-98dc-0835647d0d5f","Type":"ContainerStarted","Data":"47ef6429ed4b0f424b237f1a6c8ed7c20abeb7c98961d10cd0b93a6021f979d1"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.359097 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" 
event={"ID":"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2","Type":"ContainerStarted","Data":"67e1b01ee217e330d96e8fdbab351fb318fb47774ac99a3d910c63aa9326c111"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.361885 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" event={"ID":"42d7ef3d-2c3b-455c-9457-44441f1bfcff","Type":"ContainerStarted","Data":"afe0d69f73678372c93451257c7d5664926ee3689192b71e6d9cbd44bd82813a"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.361908 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" event={"ID":"42d7ef3d-2c3b-455c-9457-44441f1bfcff","Type":"ContainerStarted","Data":"599d763dad17aa921a7c1e59f7a1aafcc5bb0c99fdc753fee9999114330b93dd"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.362528 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.368291 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9sqqb"] Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.373186 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l" event={"ID":"2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4","Type":"ContainerStarted","Data":"ceb6ee0724d4744817a7bc883c7174bc1478c73f0ab0044b55633cb6dcd46c16"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.373217 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l" event={"ID":"2c3f9520-d8f8-4ec1-b4c0-9ca1baed86c4","Type":"ContainerStarted","Data":"ad5afad715701b1af6ffb7e5270bc6c68faf188f216c75c31b23c8b635fcd086"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 
14:03:36.384720 4756 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-95vm5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.384760 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" podUID="42d7ef3d-2c3b-455c-9457-44441f1bfcff" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.413429 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564042-6mtlx" event={"ID":"bfa461e5-a4e9-4cfa-a279-df6d4a56c973","Type":"ContainerStarted","Data":"f8c5af48c5c990f9985ba3860bf98684cab494c7f98cef18b658e6d02972a16c"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.425810 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" event={"ID":"37777140-c5a3-4503-abbd-0d316a535630","Type":"ContainerStarted","Data":"5658525eab7ed21a79d0a951112fdbcfbe6ca30b4c9970739a4be21ef94ced1e"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.439020 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.441632 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc"] Mar 18 14:03:36 crc 
kubenswrapper[4756]: E0318 14:03:36.441980 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:36.941097915 +0000 UTC m=+218.255515890 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.453260 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" event={"ID":"487d1c97-b703-4f1b-8c77-c23b4366a467","Type":"ContainerStarted","Data":"7bda1623aa31ed92a9284992313e5eedb4347d81950af29a664bcc7c08f568d9"} Mar 18 14:03:36 crc kubenswrapper[4756]: W0318 14:03:36.457338 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3f5ac66_56e5_4477_ac1b_1ef496242243.slice/crio-685be2bf704064d590cd43d659e3dde4deb31eaa99fe1903e29d70a8c32d1b0c WatchSource:0}: Error finding container 685be2bf704064d590cd43d659e3dde4deb31eaa99fe1903e29d70a8c32d1b0c: Status 404 returned error can't find the container with id 685be2bf704064d590cd43d659e3dde4deb31eaa99fe1903e29d70a8c32d1b0c Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.461471 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" 
event={"ID":"f7c7f16d-c4d5-4a88-ac61-6de2735da80c","Type":"ContainerStarted","Data":"34f6188c755fc36ba7054e3126eca71d870010ab7c3c7cc607afc1c16afadf05"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.474524 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f9hzx" event={"ID":"d199014d-5b92-48d1-966d-30af0da2e1c2","Type":"ContainerStarted","Data":"27a1dc3d4719731f2a98c0cf8376c261a6a11f137a414a7c3afd28c9a6c0763c"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.480022 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h" event={"ID":"e271e583-b5d3-452e-a009-f8b21a8121d9","Type":"ContainerStarted","Data":"68ba4f7b44f79af68cb5dc1a3b4b8560189f64da299ba35048a528ba71111204"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.480062 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h" event={"ID":"e271e583-b5d3-452e-a009-f8b21a8121d9","Type":"ContainerStarted","Data":"c199295e8136756badb1400dc2facdd15f911edf1010a3d8fa873e065ac66cf9"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.485870 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" event={"ID":"3985e570-6d23-4928-a018-40e9b5868b89","Type":"ContainerStarted","Data":"afaa22dd5874f0d98fbd58ab6469d91a9eb69a4250f64a93894735e7d2647ab1"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.485911 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" event={"ID":"3985e570-6d23-4928-a018-40e9b5868b89","Type":"ContainerStarted","Data":"a331791debdeb0ace25c1e974e96f27a4cff5afead921fc2f7d874508b6a5e06"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.487634 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" event={"ID":"efc607b9-67d2-4f27-8ebc-03f067d11caf","Type":"ContainerStarted","Data":"4df6331934ffaf0f922f6ce66aeba361b646a312989b42f2d7634e9a01e077e7"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.488727 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" event={"ID":"2faea523-d2d3-46fd-b623-fe2cc8928c8d","Type":"ContainerStarted","Data":"8128ab0ef829539581a84f5ee63b81fa71a2913a6dcd78edda99c80465f096e4"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.491899 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm" event={"ID":"f6816a11-7f42-40ca-9bbe-ec5f3d0d019a","Type":"ContainerStarted","Data":"9bc999e68d2e1cfb4540e9b1183efe979be4186717a8eb0d5ec7772360c879cf"} Mar 18 14:03:36 crc kubenswrapper[4756]: W0318 14:03:36.504817 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21001d0e_8d77_4fa1_81b4_7867be45f92a.slice/crio-2fd5d8725fefb4b277c48c29622b533d00224caa8a29ac0e51e2cf3ddc1396dd WatchSource:0}: Error finding container 2fd5d8725fefb4b277c48c29622b533d00224caa8a29ac0e51e2cf3ddc1396dd: Status 404 returned error can't find the container with id 2fd5d8725fefb4b277c48c29622b533d00224caa8a29ac0e51e2cf3ddc1396dd Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.508436 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" event={"ID":"c7298879-219c-4329-a9e6-1854fc22b1d9","Type":"ContainerStarted","Data":"2e1be6563bbec9f91c02010f15b1c501f61e9a251a0770ae223309e7c5eb1c7a"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.509262 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" 
event={"ID":"c7298879-219c-4329-a9e6-1854fc22b1d9","Type":"ContainerStarted","Data":"f0796f574574505b6495ce6c5686fea658ee8038faa65ecaca7e741555c32476"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.509283 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.511355 4756 generic.go:334] "Generic (PLEG): container finished" podID="c7067572-3edd-48f1-a03d-7f83887ab9ca" containerID="59f377214e0d3965a97073fe7bc1f9ef2aa636650903730fbe1918bdb7d26c63" exitCode=0 Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.512031 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" event={"ID":"c7067572-3edd-48f1-a03d-7f83887ab9ca","Type":"ContainerDied","Data":"59f377214e0d3965a97073fe7bc1f9ef2aa636650903730fbe1918bdb7d26c63"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.512058 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" event={"ID":"c7067572-3edd-48f1-a03d-7f83887ab9ca","Type":"ContainerStarted","Data":"c6016e56f0a1788ead0c6f53855ad7483f796505e184d22f44cb50adea97f90e"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.516186 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9s7gz"] Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.516780 4756 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kld5c container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:5443/healthz\": dial tcp 10.217.0.17:5443: connect: connection refused" start-of-body= Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.516824 4756 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" podUID="c7298879-219c-4329-a9e6-1854fc22b1d9" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.17:5443/healthz\": dial tcp 10.217.0.17:5443: connect: connection refused" Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.526370 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz" event={"ID":"cc7ae5dd-fe5a-47b7-b9bf-a3e86c39be3c","Type":"ContainerStarted","Data":"ca9e59ec97ffa8be3dbbba26fb3d6256df4aa79c4223fbdf9f9449ccca67d460"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.540419 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:36 crc kubenswrapper[4756]: E0318 14:03:36.541895 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:37.041878749 +0000 UTC m=+218.356296724 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.564968 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-prmvd" event={"ID":"b811256d-cc06-4ab8-a482-5ea91528ad95","Type":"ContainerStarted","Data":"271c12c711009eab8d6082494214d4875f4970cb238a54d08c2bb1a1fc303fa9"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.592546 4756 generic.go:334] "Generic (PLEG): container finished" podID="5f7750ad-9fab-43ab-bcf2-fc8eaa797013" containerID="ebaf157f2a771d145b2037299e50ad81bbeef94dd11cba08393924cc8f762a77" exitCode=0 Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.592607 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" event={"ID":"5f7750ad-9fab-43ab-bcf2-fc8eaa797013","Type":"ContainerDied","Data":"ebaf157f2a771d145b2037299e50ad81bbeef94dd11cba08393924cc8f762a77"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.592644 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" event={"ID":"5f7750ad-9fab-43ab-bcf2-fc8eaa797013","Type":"ContainerStarted","Data":"1354278e07b5c7df4336e6d7cd04257689d6772a6a0fa486bfb91ca9731850e5"} Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.624232 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-br5r4"] Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.626404 4756 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wvqhb"] Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.642372 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:36 crc kubenswrapper[4756]: E0318 14:03:36.643775 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:37.143755503 +0000 UTC m=+218.458173478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.741787 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tc72q"] Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.745067 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:36 crc kubenswrapper[4756]: 
E0318 14:03:36.745409 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:37.245396421 +0000 UTC m=+218.559814396 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.844677 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qd587"] Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.846032 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:36 crc kubenswrapper[4756]: E0318 14:03:36.846483 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:37.346467983 +0000 UTC m=+218.660885948 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.949390 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:36 crc kubenswrapper[4756]: E0318 14:03:36.950049 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:37.450030832 +0000 UTC m=+218.764448807 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.979700 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-77tzj"] Mar 18 14:03:36 crc kubenswrapper[4756]: I0318 14:03:36.992321 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j"] Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.046378 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dtbtw"] Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.047735 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" podStartSLOduration=161.047724394 podStartE2EDuration="2m41.047724394s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:37.040960412 +0000 UTC m=+218.355378377" watchObservedRunningTime="2026-03-18 14:03:37.047724394 +0000 UTC m=+218.362142369" Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.053456 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 
18 14:03:37 crc kubenswrapper[4756]: E0318 14:03:37.053725 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:37.553710914 +0000 UTC m=+218.868128889 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.078006 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55"] Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.099773 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw"] Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.128073 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mm2qg"] Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.144443 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-f9hzx" podStartSLOduration=161.144425868 podStartE2EDuration="2m41.144425868s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:37.143347399 +0000 UTC m=+218.457765374" watchObservedRunningTime="2026-03-18 14:03:37.144425868 +0000 UTC 
m=+218.458843843" Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.159803 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:37 crc kubenswrapper[4756]: E0318 14:03:37.160639 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:37.660625453 +0000 UTC m=+218.975043428 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.162464 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2"] Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.197928 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.198621 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rxqlj" podStartSLOduration=161.198602242 podStartE2EDuration="2m41.198602242s" 
podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:37.197870122 +0000 UTC m=+218.512288097" watchObservedRunningTime="2026-03-18 14:03:37.198602242 +0000 UTC m=+218.513020217" Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.204036 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:37 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:37 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:37 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.204089 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:37 crc kubenswrapper[4756]: W0318 14:03:37.240034 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7de9ef01_2f54_487b_b0f6_33585560ee17.slice/crio-f3fe941363e86827c953e61f25609e1846120619d388d0f59df80e984d49fd73 WatchSource:0}: Error finding container f3fe941363e86827c953e61f25609e1846120619d388d0f59df80e984d49fd73: Status 404 returned error can't find the container with id f3fe941363e86827c953e61f25609e1846120619d388d0f59df80e984d49fd73 Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.261354 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:37 crc kubenswrapper[4756]: E0318 14:03:37.261714 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:37.761697205 +0000 UTC m=+219.076115180 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:37 crc kubenswrapper[4756]: W0318 14:03:37.271203 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode57e76f1_bc32_4531_bdad_9c2f929c13f1.slice/crio-c68530e23fc31565bf848cc0ac9e48f0df3d26ba5d615ad1918a6447e738ebb0 WatchSource:0}: Error finding container c68530e23fc31565bf848cc0ac9e48f0df3d26ba5d615ad1918a6447e738ebb0: Status 404 returned error can't find the container with id c68530e23fc31565bf848cc0ac9e48f0df3d26ba5d615ad1918a6447e738ebb0 Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.304815 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-p9mcq"] Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.336941 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" podStartSLOduration=161.336920644 podStartE2EDuration="2m41.336920644s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:37.33340618 +0000 UTC m=+218.647824155" watchObservedRunningTime="2026-03-18 14:03:37.336920644 +0000 UTC m=+218.651338619" Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.363299 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:37 crc kubenswrapper[4756]: E0318 14:03:37.363679 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:37.863667581 +0000 UTC m=+219.178085556 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:37 crc kubenswrapper[4756]: W0318 14:03:37.380851 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66923387_ae05_40a5_9a8b_8de577e30cb1.slice/crio-56ef3dac0c9703817a04d742162db64a19afc147e75436b054bd30f7b95a62b4 WatchSource:0}: Error finding container 56ef3dac0c9703817a04d742162db64a19afc147e75436b054bd30f7b95a62b4: Status 404 returned error can't find the container with id 56ef3dac0c9703817a04d742162db64a19afc147e75436b054bd30f7b95a62b4 Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.401448 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vxnd9"] Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.413644 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.468507 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:37 crc kubenswrapper[4756]: E0318 14:03:37.469084 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:37.96906793 +0000 UTC m=+219.283485895 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.571318 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.571343 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-rs6j9" podStartSLOduration=161.571323734 podStartE2EDuration="2m41.571323734s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:37.50299315 +0000 UTC m=+218.817411125" watchObservedRunningTime="2026-03-18 14:03:37.571323734 +0000 UTC m=+218.885741709" Mar 18 14:03:37 crc kubenswrapper[4756]: E0318 14:03:37.571985 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 14:03:38.071972551 +0000 UTC m=+219.386390526 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.616549 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xck8l" podStartSLOduration=161.616529297 podStartE2EDuration="2m41.616529297s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:37.614652436 +0000 UTC m=+218.929070421" watchObservedRunningTime="2026-03-18 14:03:37.616529297 +0000 UTC m=+218.930947282" Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.622034 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qd587" event={"ID":"e79e3bba-d970-4daf-983d-aa4894129506","Type":"ContainerStarted","Data":"cb3ed3408de81b643ddf53078201e319b8a019409088cc42b1efa7955754bae7"} Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.635425 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" event={"ID":"efc607b9-67d2-4f27-8ebc-03f067d11caf","Type":"ContainerStarted","Data":"45f57be0eb5f1feffc491a682659dfb990befe5a95e59d998e6450223ba9b3f1"} Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.643706 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" event={"ID":"d3f5ac66-56e5-4477-ac1b-1ef496242243","Type":"ContainerStarted","Data":"685be2bf704064d590cd43d659e3dde4deb31eaa99fe1903e29d70a8c32d1b0c"} Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.651891 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" event={"ID":"31db1d26-2d05-489e-9177-580797f8897c","Type":"ContainerStarted","Data":"67546d7535e9cf4202b1b853b76e2d9d0ec55d4ae9e108cafc25da469f6e947e"} Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.656406 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dtbtw" event={"ID":"32ba09f4-59d7-469c-a882-5564e653e868","Type":"ContainerStarted","Data":"81b3922eacaf2311a374ddb89f0662b9665901ab9a70cbc512f3b34f0514a4b3"} Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.662367 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" event={"ID":"21001d0e-8d77-4fa1-81b4-7867be45f92a","Type":"ContainerStarted","Data":"2fd5d8725fefb4b277c48c29622b533d00224caa8a29ac0e51e2cf3ddc1396dd"} Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.673752 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:37 crc kubenswrapper[4756]: E0318 14:03:37.674959 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 14:03:38.174938964 +0000 UTC m=+219.489356939 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.684305 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" event={"ID":"a57b45d3-c455-4b6c-a56e-6884e65670a0","Type":"ContainerStarted","Data":"4fb937fc693bea235c6a8cfae9b3a46c5e29109e643097b79ae71ca9b102e602"} Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.684350 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" event={"ID":"a57b45d3-c455-4b6c-a56e-6884e65670a0","Type":"ContainerStarted","Data":"2189d3a1fcd69079627f6526f088574274030a07ad0050289f6460ad63d9d920"} Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.686566 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.691086 4756 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-hp2sc container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.696374 4756 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" podUID="a57b45d3-c455-4b6c-a56e-6884e65670a0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.709148 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz" event={"ID":"cc7ae5dd-fe5a-47b7-b9bf-a3e86c39be3c","Type":"ContainerStarted","Data":"cca6a4e37a1263e95b3279c46410d1faebb39dadda939a388fc316664329311c"} Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.717802 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2jfx5" podStartSLOduration=161.717785754 podStartE2EDuration="2m41.717785754s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:37.694319625 +0000 UTC m=+219.008737600" watchObservedRunningTime="2026-03-18 14:03:37.717785754 +0000 UTC m=+219.032203729" Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.771607 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" event={"ID":"d01fc0fa-9be1-4493-be47-c420ebb5341d","Type":"ContainerStarted","Data":"97df421dad3eff6122572977c749baeacd58b16d77f1403c985b71ab2ffbb2ae"} Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.781184 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:37 crc kubenswrapper[4756]: E0318 14:03:37.781780 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:38.281764381 +0000 UTC m=+219.596182396 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.837285 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" event={"ID":"66923387-ae05-40a5-9a8b-8de577e30cb1","Type":"ContainerStarted","Data":"56ef3dac0c9703817a04d742162db64a19afc147e75436b054bd30f7b95a62b4"} Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.853673 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv" event={"ID":"67ea2f51-e62b-497a-98dc-0835647d0d5f","Type":"ContainerStarted","Data":"5c11999a25226167ab825305c5f99eedd2ada24644bb2a5ea539bbfff81d570a"} Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.855199 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mxm9h" podStartSLOduration=161.855179321 podStartE2EDuration="2m41.855179321s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-18 14:03:37.852312404 +0000 UTC m=+219.166730399" watchObservedRunningTime="2026-03-18 14:03:37.855179321 +0000 UTC m=+219.169597306" Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.884164 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:37 crc kubenswrapper[4756]: E0318 14:03:37.885698 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:38.385676489 +0000 UTC m=+219.700094464 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.976384 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" event={"ID":"d3a78872-6429-4e96-af97-024c68c537d7","Type":"ContainerStarted","Data":"6c6943df95463872d63b7ac410559c4937b34b3c859291d7478b3b533730fcc1"} Mar 18 14:03:37 crc kubenswrapper[4756]: I0318 14:03:37.985381 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:37 crc kubenswrapper[4756]: E0318 14:03:37.985746 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:38.485735204 +0000 UTC m=+219.800153179 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.005548 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-77tzj" event={"ID":"cfd097dc-6461-4bf5-8952-2a0ebd249f35","Type":"ContainerStarted","Data":"d4514d369c3ea384f9d95c3417ce17b2cc4e3a0bef24cbd1b2004cb936274117"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.020235 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" podStartSLOduration=162.020216389 podStartE2EDuration="2m42.020216389s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:37.961831002 +0000 UTC m=+219.276248977" watchObservedRunningTime="2026-03-18 14:03:38.020216389 +0000 UTC 
m=+219.334634364" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.021083 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" event={"ID":"f7c7f16d-c4d5-4a88-ac61-6de2735da80c","Type":"ContainerStarted","Data":"a5cf2176f83579219e500c876ae11966a4d6663f8f7175b405b96cc618e5be19"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.021107 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" event={"ID":"f7c7f16d-c4d5-4a88-ac61-6de2735da80c","Type":"ContainerStarted","Data":"fd2903c5b5333112626ce4d16f40745ed9fee5422ca7ce99cfde32984b65fc08"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.038457 4756 generic.go:334] "Generic (PLEG): container finished" podID="37777140-c5a3-4503-abbd-0d316a535630" containerID="20ae69ed811f3fb69b613a8059633b8bab184d96119c2197272b2c3e28ec9eff" exitCode=0 Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.038548 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" event={"ID":"37777140-c5a3-4503-abbd-0d316a535630","Type":"ContainerDied","Data":"20ae69ed811f3fb69b613a8059633b8bab184d96119c2197272b2c3e28ec9eff"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.068399 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vxnd9" event={"ID":"5783835a-afb1-41e5-b868-2fc94a44b7f6","Type":"ContainerStarted","Data":"73f774ad8b881d3ca3e3a6b9480c96ab79b2b644e06ba1ceaaa518329310b101"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.091312 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt" 
event={"ID":"9b93428b-357a-4f73-ba6f-c46a4c475a98","Type":"ContainerStarted","Data":"d20a98abe6348f84b155269a40964e4ab3ff7ae9bdf7dcbdcbac46eabb6b30cc"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.091357 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt" event={"ID":"9b93428b-357a-4f73-ba6f-c46a4c475a98","Type":"ContainerStarted","Data":"5fb8fd026ec996760dbbedd98d3cd0fe2deac551821fbaa519b93d6ca131d0ac"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.091390 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:38 crc kubenswrapper[4756]: E0318 14:03:38.091798 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:38.591777789 +0000 UTC m=+219.906195774 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.092249 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:38 crc kubenswrapper[4756]: E0318 14:03:38.094092 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:38.594081861 +0000 UTC m=+219.908499836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.125558 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" event={"ID":"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2","Type":"ContainerStarted","Data":"c330c07d871ed30ac915eb719f9a7f906f4f5aff091f63c035cfc950c3d0241f"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.127267 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.130563 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-588c8" podStartSLOduration=162.13054711 podStartE2EDuration="2m42.13054711s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:38.020819945 +0000 UTC m=+219.335237920" watchObservedRunningTime="2026-03-18 14:03:38.13054711 +0000 UTC m=+219.444965085" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.159769 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-2ptxb" event={"ID":"259f3fb2-856e-4885-8955-6703997715f9","Type":"ContainerStarted","Data":"8e78eb27604c87173be39eea751ae16e82c1137f9734d8b2c5457993f2ec1bd2"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.159824 
4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-2ptxb" event={"ID":"259f3fb2-856e-4885-8955-6703997715f9","Type":"ContainerStarted","Data":"7560c5897dc89debdd446273aa7aab4a6b458d0bf1933f7ad22786c3a9ba5b67"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.178505 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.192442 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5" event={"ID":"67423a72-d377-414a-b293-56f824df4d45","Type":"ContainerStarted","Data":"b0bc083a35bf0b1b8d33e2b1e756735dc32aca8d3ab71367011c42c9bcfcf346"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.192906 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:38 crc kubenswrapper[4756]: E0318 14:03:38.193876 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:38.693861699 +0000 UTC m=+220.008279674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.210610 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:38 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:38 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:38 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.210931 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.223611 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k5xg9" event={"ID":"f88a8bdd-954f-455c-aad1-03b1988afa37","Type":"ContainerStarted","Data":"22e3b1e94cef2835a0cfbc2dc20b571b8dc0198acbfab345e7e9b7ebba9d182c"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.226489 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" event={"ID":"7de9ef01-2f54-487b-b0f6-33585560ee17","Type":"ContainerStarted","Data":"f3fe941363e86827c953e61f25609e1846120619d388d0f59df80e984d49fd73"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 
14:03:38.246648 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54jrv" podStartSLOduration=162.246624685 podStartE2EDuration="2m42.246624685s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:38.174983602 +0000 UTC m=+219.489401577" watchObservedRunningTime="2026-03-18 14:03:38.246624685 +0000 UTC m=+219.561042670" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.247194 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9klwh" event={"ID":"6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a","Type":"ContainerStarted","Data":"0927f1552e87f67c5907615e03c00abefad2472926ca42efbd403650557e93dc"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.266617 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" event={"ID":"487d1c97-b703-4f1b-8c77-c23b4366a467","Type":"ContainerStarted","Data":"08d1ec45761c189938b8c75e287b3fa3027fce6b571256842d1be256be20d3d3"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.267602 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.268831 4756 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4ckd5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/healthz\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.268879 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" 
podUID="487d1c97-b703-4f1b-8c77-c23b4366a467" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.12:8080/healthz\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.285147 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" podStartSLOduration=162.285102487 podStartE2EDuration="2m42.285102487s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:38.235028704 +0000 UTC m=+219.549446679" watchObservedRunningTime="2026-03-18 14:03:38.285102487 +0000 UTC m=+219.599520452" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.292414 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55" event={"ID":"ff89c47d-35b0-46b1-ad37-cbc600a377a1","Type":"ContainerStarted","Data":"a3763b05c0ac546f01500623a7d116144ef894b16781c60029f61bf1e43051de"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.302911 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:38 crc kubenswrapper[4756]: E0318 14:03:38.304283 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:38.804268481 +0000 UTC m=+220.118686456 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.351573 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kxcdf" podStartSLOduration=162.35155047 podStartE2EDuration="2m42.35155047s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:38.2784843 +0000 UTC m=+219.592902285" watchObservedRunningTime="2026-03-18 14:03:38.35155047 +0000 UTC m=+219.665968445" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.352799 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tc72q" event={"ID":"f25798a1-191b-4db2-86f5-3d49abb6de78","Type":"ContainerStarted","Data":"01eb057807391fdcb105d12dd16fff3ebbba266ea0fc80f3f9af95f1dd6642f6"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.378417 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mm2qg" event={"ID":"e57e76f1-bc32-4531-bdad-9c2f929c13f1","Type":"ContainerStarted","Data":"c68530e23fc31565bf848cc0ac9e48f0df3d26ba5d615ad1918a6447e738ebb0"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.392807 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-2ptxb" podStartSLOduration=162.392791977 podStartE2EDuration="2m42.392791977s" 
podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:38.390998309 +0000 UTC m=+219.705416294" watchObservedRunningTime="2026-03-18 14:03:38.392791977 +0000 UTC m=+219.707209952" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.408763 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2xktt" podStartSLOduration=162.408744535 podStartE2EDuration="2m42.408744535s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:38.362278768 +0000 UTC m=+219.676696743" watchObservedRunningTime="2026-03-18 14:03:38.408744535 +0000 UTC m=+219.723162510" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.407753 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:38 crc kubenswrapper[4756]: E0318 14:03:38.407823 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:38.907803469 +0000 UTC m=+220.222221444 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.418108 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:38 crc kubenswrapper[4756]: E0318 14:03:38.418694 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:38.918678091 +0000 UTC m=+220.233096066 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.422666 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" podStartSLOduration=162.422651308 podStartE2EDuration="2m42.422651308s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:38.420255964 +0000 UTC m=+219.734673939" watchObservedRunningTime="2026-03-18 14:03:38.422651308 +0000 UTC m=+219.737069283" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.429990 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm" event={"ID":"f6816a11-7f42-40ca-9bbe-ec5f3d0d019a","Type":"ContainerStarted","Data":"16eecc0b0593b66954bf1149db6b70e9aeeb85d00f1caa9ade1dd1cb64c45a1c"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.472072 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-k5xg9" podStartSLOduration=162.472048083 podStartE2EDuration="2m42.472048083s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:38.471564691 +0000 UTC m=+219.785982666" watchObservedRunningTime="2026-03-18 14:03:38.472048083 +0000 UTC m=+219.786466058" Mar 18 14:03:38 crc 
kubenswrapper[4756]: I0318 14:03:38.517510 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-br5r4" event={"ID":"be0a5a79-545f-4411-8cb1-d9de4e87d983","Type":"ContainerStarted","Data":"b6892c10919b4e2259d927b2a797f410dd09b371c4b7b67d72d9dfa5ad1b4647"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.518798 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:38 crc kubenswrapper[4756]: E0318 14:03:38.519388 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:39.019361483 +0000 UTC m=+220.333779458 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.529833 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-prmvd" event={"ID":"b811256d-cc06-4ab8-a482-5ea91528ad95","Type":"ContainerStarted","Data":"268ba441c4dd03e026d818cbe37abdef0c8b6b919038f2e865fb62defd9e6a66"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.536818 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wvqhb" event={"ID":"e2377525-7cea-46ad-821e-834cbb9b9591","Type":"ContainerStarted","Data":"ab54f2cb392c0d8b59dcd8213e8151fccd3d848e68fa761cad8efbec3779a805"} Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.538880 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tc72q" podStartSLOduration=6.538862927 podStartE2EDuration="6.538862927s" podCreationTimestamp="2026-03-18 14:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:38.532486816 +0000 UTC m=+219.846904791" watchObservedRunningTime="2026-03-18 14:03:38.538862927 +0000 UTC m=+219.853280902" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.560793 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kld5c" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.574882 4756 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9klwh" podStartSLOduration=6.574854183 podStartE2EDuration="6.574854183s" podCreationTimestamp="2026-03-18 14:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:38.565295595 +0000 UTC m=+219.879713560" watchObservedRunningTime="2026-03-18 14:03:38.574854183 +0000 UTC m=+219.889272158" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.575778 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.620657 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:38 crc kubenswrapper[4756]: E0318 14:03:38.624773 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:39.124759272 +0000 UTC m=+220.439177247 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.729609 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:38 crc kubenswrapper[4756]: E0318 14:03:38.729893 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:39.229869942 +0000 UTC m=+220.544287917 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.730144 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:38 crc kubenswrapper[4756]: E0318 14:03:38.730413 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:39.230383875 +0000 UTC m=+220.544801840 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.769248 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-br5r4" podStartSLOduration=162.769231238 podStartE2EDuration="2m42.769231238s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:38.767999656 +0000 UTC m=+220.082417631" watchObservedRunningTime="2026-03-18 14:03:38.769231238 +0000 UTC m=+220.083649213" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.772967 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-prmvd" podStartSLOduration=162.772951849 podStartE2EDuration="2m42.772951849s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:38.711406697 +0000 UTC m=+220.025824672" watchObservedRunningTime="2026-03-18 14:03:38.772951849 +0000 UTC m=+220.087369824" Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.833919 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:38 crc kubenswrapper[4756]: E0318 14:03:38.834253 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:39.334238782 +0000 UTC m=+220.648656747 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:38 crc kubenswrapper[4756]: I0318 14:03:38.938098 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:38 crc kubenswrapper[4756]: E0318 14:03:38.938716 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:39.438699016 +0000 UTC m=+220.753116991 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.039391 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:39 crc kubenswrapper[4756]: E0318 14:03:39.039939 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:39.539912522 +0000 UTC m=+220.854330487 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.040293 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:39 crc kubenswrapper[4756]: E0318 14:03:39.041831 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:39.541816073 +0000 UTC m=+220.856234048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.145095 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:39 crc kubenswrapper[4756]: E0318 14:03:39.145501 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:39.645482765 +0000 UTC m=+220.959900740 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.211032 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:39 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:39 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:39 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.211099 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.247980 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:39 crc kubenswrapper[4756]: E0318 14:03:39.248612 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 14:03:39.748600311 +0000 UTC m=+221.063018286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.349490 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:39 crc kubenswrapper[4756]: E0318 14:03:39.349863 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:39.849848838 +0000 UTC m=+221.164266813 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.351047 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bxxfm" podStartSLOduration=163.35102777 podStartE2EDuration="2m43.35102777s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:38.881452679 +0000 UTC m=+220.195870654" watchObservedRunningTime="2026-03-18 14:03:39.35102777 +0000 UTC m=+220.665445745" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.452079 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:39 crc kubenswrapper[4756]: E0318 14:03:39.452450 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:39.952434371 +0000 UTC m=+221.266852356 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.556870 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:39 crc kubenswrapper[4756]: E0318 14:03:39.557269 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:40.057251265 +0000 UTC m=+221.371669240 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.594005 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9klwh" event={"ID":"6b79eb24-4a9c-4f0f-829c-3c9c0d2c5a5a","Type":"ContainerStarted","Data":"53fa0786d4b95760bb4a9c83216f7e74fbe90eb169f5eec8ef9dfe05bce2d973"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.604503 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" event={"ID":"d3f5ac66-56e5-4477-ac1b-1ef496242243","Type":"ContainerStarted","Data":"55017121b61bac574aa0603c66a1c0c5e0720b60d3fa0d1712ea5970acfd5e14"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.604841 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.606200 4756 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-9sqqb container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" start-of-body= Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.606260 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" podUID="d3f5ac66-56e5-4477-ac1b-1ef496242243" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": 
dial tcp 10.217.0.14:6443: connect: connection refused" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.607808 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5" event={"ID":"67423a72-d377-414a-b293-56f824df4d45","Type":"ContainerStarted","Data":"55949f45d585546751deec05bdddecd1d7e5df916e7bfb05dfeb2915d67073f9"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.622001 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" event={"ID":"37777140-c5a3-4503-abbd-0d316a535630","Type":"ContainerStarted","Data":"0affe4c3b675df99e2b5e6b612b8d1bddae130bcbe391d31bab3b7f0425f581a"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.622533 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.641420 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" event={"ID":"c7067572-3edd-48f1-a03d-7f83887ab9ca","Type":"ContainerStarted","Data":"91d48fe1437689c12e09be755c84d4fc81082eb2e8b35886e922e5ae4d7527f6"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.653157 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" podStartSLOduration=163.653108747 podStartE2EDuration="2m43.653108747s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:39.65062428 +0000 UTC m=+220.965042255" watchObservedRunningTime="2026-03-18 14:03:39.653108747 +0000 UTC m=+220.967526712" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.655277 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-dtbtw" event={"ID":"32ba09f4-59d7-469c-a882-5564e653e868","Type":"ContainerStarted","Data":"d75d7c744f935eec1c7911e14315d41920c2c895a054d83cfdd53e1e128730a2"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.656373 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dtbtw" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.658223 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:39 crc kubenswrapper[4756]: E0318 14:03:39.660019 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:40.160002051 +0000 UTC m=+221.474420026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.664575 4756 patch_prober.go:28] interesting pod/downloads-7954f5f757-dtbtw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.664634 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dtbtw" podUID="32ba09f4-59d7-469c-a882-5564e653e868" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.668795 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mm2qg" event={"ID":"e57e76f1-bc32-4531-bdad-9c2f929c13f1","Type":"ContainerStarted","Data":"245a30f40f1d6191340d10125e921ed79f4e72f924bb0c510d847c0348accf6e"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.681739 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vxnd9" event={"ID":"5783835a-afb1-41e5-b868-2fc94a44b7f6","Type":"ContainerStarted","Data":"4c280856184163fe746f19a4d138a97365f251670e7021166b604a3f7ca265d7"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.689178 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" 
podStartSLOduration=163.689162894 podStartE2EDuration="2m43.689162894s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:39.687493969 +0000 UTC m=+221.001911944" watchObservedRunningTime="2026-03-18 14:03:39.689162894 +0000 UTC m=+221.003580869" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.697167 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz" event={"ID":"cc7ae5dd-fe5a-47b7-b9bf-a3e86c39be3c","Type":"ContainerStarted","Data":"8ce1ee97d090528d0055cf4a9e3a04ebd4632de11019015869cfc879d1a9d8a9"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.697322 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.719394 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" event={"ID":"d01fc0fa-9be1-4493-be47-c420ebb5341d","Type":"ContainerStarted","Data":"0092a02292b1e13b8a9aa4236a7493aa52151bb9f3e7d6981a18e2d4f053f9b7"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.719447 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" event={"ID":"d01fc0fa-9be1-4493-be47-c420ebb5341d","Type":"ContainerStarted","Data":"bba0acb20af8a4ef2695bc0c12ed968f87cb00710dacda6652546d307f49bdf6"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.721379 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2gbt5" podStartSLOduration=163.721367898 podStartE2EDuration="2m43.721367898s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:39.718023738 +0000 UTC m=+221.032441713" watchObservedRunningTime="2026-03-18 14:03:39.721367898 +0000 UTC m=+221.035785873" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.729373 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" event={"ID":"31db1d26-2d05-489e-9177-580797f8897c","Type":"ContainerStarted","Data":"3bddb0496b08fed63d21de430db606e49d46cfdd6c4819ab0775c8c6e5033594"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.758347 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55" event={"ID":"ff89c47d-35b0-46b1-ad37-cbc600a377a1","Type":"ContainerStarted","Data":"cb9b9600f8c9516241e2ce5dcbb9bb0ad2039e1ff0d54c7ca21559f7ad7fc25c"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.759366 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55" event={"ID":"ff89c47d-35b0-46b1-ad37-cbc600a377a1","Type":"ContainerStarted","Data":"ff0df17c24fb8413d6ca7746634b4f26cba1d573de02ea15a2f69f995ecaf646"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.759599 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:39 crc kubenswrapper[4756]: E0318 14:03:39.759854 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 14:03:40.25983544 +0000 UTC m=+221.574253415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.759945 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:39 crc kubenswrapper[4756]: E0318 14:03:39.760824 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:40.260815816 +0000 UTC m=+221.575233791 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.787255 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k5xg9" event={"ID":"f88a8bdd-954f-455c-aad1-03b1988afa37","Type":"ContainerStarted","Data":"6977df527fac0f6501afd430cd14031bd2aefe17d2553bfa0d03e71bea89d5b4"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.798796 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fkf55" podStartSLOduration=163.798768185 podStartE2EDuration="2m43.798768185s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:39.793532355 +0000 UTC m=+221.107950330" watchObservedRunningTime="2026-03-18 14:03:39.798768185 +0000 UTC m=+221.113186150" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.800485 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-77tzj" event={"ID":"cfd097dc-6461-4bf5-8952-2a0ebd249f35","Type":"ContainerStarted","Data":"ed0a781bad4861871979fd963737f8b1bb060965c174b0e2169449ec17f8adad"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.800820 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.814875 4756 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" podStartSLOduration=163.814853326 podStartE2EDuration="2m43.814853326s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:39.775251474 +0000 UTC m=+221.089669449" watchObservedRunningTime="2026-03-18 14:03:39.814853326 +0000 UTC m=+221.129271301" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.816879 4756 patch_prober.go:28] interesting pod/console-operator-58897d9998-77tzj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.816922 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-77tzj" podUID="cfd097dc-6461-4bf5-8952-2a0ebd249f35" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/readyz\": dial tcp 10.217.0.29:8443: connect: connection refused" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.834011 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-br5r4" event={"ID":"be0a5a79-545f-4411-8cb1-d9de4e87d983","Type":"ContainerStarted","Data":"d0819480979beadbff45a465ca84416f0ddde51e31c76dc2b7b1d131218038d6"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.855236 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz" podStartSLOduration=163.85520833 podStartE2EDuration="2m43.85520833s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:39.850257877 +0000 UTC m=+221.164675852" watchObservedRunningTime="2026-03-18 14:03:39.85520833 +0000 UTC m=+221.169626305" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.867176 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-prmvd" event={"ID":"b811256d-cc06-4ab8-a482-5ea91528ad95","Type":"ContainerStarted","Data":"7e53cec7dfa90cdc7956225c6b5de01de7804ca34f0a89058da477a199ac1d78"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.868513 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:39 crc kubenswrapper[4756]: E0318 14:03:39.884501 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:40.384483005 +0000 UTC m=+221.698900980 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.901306 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-dtbtw" podStartSLOduration=163.901292556 podStartE2EDuration="2m43.901292556s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:39.879309596 +0000 UTC m=+221.193727581" watchObservedRunningTime="2026-03-18 14:03:39.901292556 +0000 UTC m=+221.215710531" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.904313 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" event={"ID":"7de9ef01-2f54-487b-b0f6-33585560ee17","Type":"ContainerStarted","Data":"c716d450d9ddeb0f49d1721131466d8e93ec5dd80ac8a50bac5d80a28c215889"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.906198 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qd587" event={"ID":"e79e3bba-d970-4daf-983d-aa4894129506","Type":"ContainerStarted","Data":"79390a8b83675693b577860644b0391ee1b51e8c213ea80d8e75291b63c54a62"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.906240 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qd587" 
event={"ID":"e79e3bba-d970-4daf-983d-aa4894129506","Type":"ContainerStarted","Data":"0e6cdb66ae699cc9822738beaddfecfbe30a5ea0422682187e1619ddc138c729"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.909905 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" event={"ID":"5f7750ad-9fab-43ab-bcf2-fc8eaa797013","Type":"ContainerStarted","Data":"66610eb448434454da5ef5da645cfd6b1802f5b3d465fce9bc1720454f6b6ceb"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.910261 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.911834 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.919086 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tc72q" event={"ID":"f25798a1-191b-4db2-86f5-3d49abb6de78","Type":"ContainerStarted","Data":"2fe465ae710a4498ce4d0ef0ec50873af3857e5a55d394f57e3094acff07ed2e"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.919303 4756 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-ml5m2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.919339 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" podUID="7de9ef01-2f54-487b-b0f6-33585560ee17" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 
14:03:39.925464 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.925779 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.930134 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" event={"ID":"d3a78872-6429-4e96-af97-024c68c537d7","Type":"ContainerStarted","Data":"b93a9690bead713d92fd36b59ca4193de0e732d44e89b9bc30385cd6cc020515"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.934725 4756 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-j87k5 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.934799 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" podUID="c7067572-3edd-48f1-a03d-7f83887ab9ca" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.935790 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-49g5j" podStartSLOduration=163.935775542 podStartE2EDuration="2m43.935775542s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:39.935329959 +0000 UTC m=+221.249747944" watchObservedRunningTime="2026-03-18 14:03:39.935775542 +0000 UTC 
m=+221.250193507" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.962917 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" podStartSLOduration=163.962893989 podStartE2EDuration="2m43.962893989s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:39.962255452 +0000 UTC m=+221.276673437" watchObservedRunningTime="2026-03-18 14:03:39.962893989 +0000 UTC m=+221.277311964" Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.969209 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" event={"ID":"21001d0e-8d77-4fa1-81b4-7867be45f92a","Type":"ContainerStarted","Data":"3334cab820bd196ee39787c682c97fc6b91a9e8ac5e677ed0d03b45906f2b8ec"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.969258 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" event={"ID":"21001d0e-8d77-4fa1-81b4-7867be45f92a","Type":"ContainerStarted","Data":"82a9e5ed996eb1d76e334950e7eacf9e95d72114b5b36204a47fc84cc340ae1c"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.970852 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:39 crc kubenswrapper[4756]: E0318 14:03:39.973077 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-18 14:03:40.473059832 +0000 UTC m=+221.787477807 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.985106 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wvqhb" event={"ID":"e2377525-7cea-46ad-821e-834cbb9b9591","Type":"ContainerStarted","Data":"84458fc7f11c00651a67c93422516a9a860369a0ed2e78ea4987c2aedd95f20b"} Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.987784 4756 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4ckd5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/healthz\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 18 14:03:39 crc kubenswrapper[4756]: I0318 14:03:39.987838 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" podUID="487d1c97-b703-4f1b-8c77-c23b4366a467" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.12:8080/healthz\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.035155 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-9s7gz" podStartSLOduration=164.035136418 podStartE2EDuration="2m44.035136418s" podCreationTimestamp="2026-03-18 14:00:56 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:39.998775732 +0000 UTC m=+221.313193697" watchObservedRunningTime="2026-03-18 14:03:40.035136418 +0000 UTC m=+221.349554393" Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.052581 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hp2sc" Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.071621 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:40 crc kubenswrapper[4756]: E0318 14:03:40.073201 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:40.573185809 +0000 UTC m=+221.887603784 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.095316 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" podStartSLOduration=164.095301893 podStartE2EDuration="2m44.095301893s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:40.084981045 +0000 UTC m=+221.399399020" watchObservedRunningTime="2026-03-18 14:03:40.095301893 +0000 UTC m=+221.409719868" Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.095994 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-77tzj" podStartSLOduration=164.095990481 podStartE2EDuration="2m44.095990481s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:40.038721904 +0000 UTC m=+221.353139879" watchObservedRunningTime="2026-03-18 14:03:40.095990481 +0000 UTC m=+221.410408456" Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.155316 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" podStartSLOduration=164.155297952 podStartE2EDuration="2m44.155297952s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:40.154975544 +0000 UTC m=+221.469393529" watchObservedRunningTime="2026-03-18 14:03:40.155297952 +0000 UTC m=+221.469715927" Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.175937 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:40 crc kubenswrapper[4756]: E0318 14:03:40.176340 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:40.676326497 +0000 UTC m=+221.990744472 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.184833 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qd587" podStartSLOduration=164.184812625 podStartE2EDuration="2m44.184812625s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:40.184043083 +0000 UTC m=+221.498461058" watchObservedRunningTime="2026-03-18 14:03:40.184812625 +0000 UTC m=+221.499230600" Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.201344 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:40 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:40 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:40 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.201399 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.278008 4756 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:40 crc kubenswrapper[4756]: E0318 14:03:40.279106 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:40.779079984 +0000 UTC m=+222.093497969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.382833 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:40 crc kubenswrapper[4756]: E0318 14:03:40.383611 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:40.883596578 +0000 UTC m=+222.198014563 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.393256 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-npgj8" podStartSLOduration=164.393241327 podStartE2EDuration="2m44.393241327s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:40.282166136 +0000 UTC m=+221.596584121" watchObservedRunningTime="2026-03-18 14:03:40.393241327 +0000 UTC m=+221.707659302" Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.394766 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-wvqhb" podStartSLOduration=164.394757357 podStartE2EDuration="2m44.394757357s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:40.392631971 +0000 UTC m=+221.707049966" watchObservedRunningTime="2026-03-18 14:03:40.394757357 +0000 UTC m=+221.709175332" Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.485701 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:40 crc kubenswrapper[4756]: E0318 14:03:40.486042 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:40.986027677 +0000 UTC m=+222.300445652 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.588803 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:40 crc kubenswrapper[4756]: E0318 14:03:40.589875 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:41.089849373 +0000 UTC m=+222.404267348 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.692698 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:40 crc kubenswrapper[4756]: E0318 14:03:40.693500 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:41.193478314 +0000 UTC m=+222.507896289 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.794134 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:40 crc kubenswrapper[4756]: E0318 14:03:40.796191 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:41.296163589 +0000 UTC m=+222.610581554 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.895200 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:40 crc kubenswrapper[4756]: E0318 14:03:40.895309 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:41.395293299 +0000 UTC m=+222.709711274 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.895585 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:40 crc kubenswrapper[4756]: E0318 14:03:40.896005 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:41.395849494 +0000 UTC m=+222.710267459 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.959890 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.960672 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.964686 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.965278 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.973760 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 14:03:40 crc kubenswrapper[4756]: I0318 14:03:40.996307 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:41 crc kubenswrapper[4756]: E0318 14:03:41.000597 4756 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:41.500572115 +0000 UTC m=+222.814990100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.039069 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" event={"ID":"66923387-ae05-40a5-9a8b-8de577e30cb1","Type":"ContainerStarted","Data":"4e3fb77df69f99cef13b2e2fc19bc23488d21bccfe68a92cdfe32047ad04c603"} Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.074423 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mm2qg" event={"ID":"e57e76f1-bc32-4531-bdad-9c2f929c13f1","Type":"ContainerStarted","Data":"037d900f8bdc2eea4684049cddf2c9e7bfaa2cd22c694394e4384faef308cf3c"} Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.101983 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76bea747-b62f-4749-b01a-207a594758e1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"76bea747-b62f-4749-b01a-207a594758e1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.102031 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/76bea747-b62f-4749-b01a-207a594758e1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"76bea747-b62f-4749-b01a-207a594758e1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.102074 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:41 crc kubenswrapper[4756]: E0318 14:03:41.102432 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:41.602420137 +0000 UTC m=+222.916838112 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.102774 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vxnd9" event={"ID":"5783835a-afb1-41e5-b868-2fc94a44b7f6","Type":"ContainerStarted","Data":"3cd4281a254732ce1e657ed4ff284b65ab43a73fb1c888d6266e1893e4e349a9"} Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.103232 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-vxnd9" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.120138 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mm2qg" podStartSLOduration=165.120102781 podStartE2EDuration="2m45.120102781s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:41.113935226 +0000 UTC m=+222.428353201" watchObservedRunningTime="2026-03-18 14:03:41.120102781 +0000 UTC m=+222.434520756" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.138406 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wvqhb" event={"ID":"e2377525-7cea-46ad-821e-834cbb9b9591","Type":"ContainerStarted","Data":"84c081b4038c182201c88f2f0ae1d0bf6adde405366bf5cd4c98151e72ce98cd"} Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.146193 4756 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-lkr5r"] Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.147026 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.150774 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.165689 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" event={"ID":"5f7750ad-9fab-43ab-bcf2-fc8eaa797013","Type":"ContainerStarted","Data":"40a658eb0c5e4c68075f95e668d1c7ea0cb15d97a3a253090067c672fe71155f"} Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.166247 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vxnd9" podStartSLOduration=9.16622802 podStartE2EDuration="9.16622802s" podCreationTimestamp="2026-03-18 14:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:41.163814005 +0000 UTC m=+222.478231980" watchObservedRunningTime="2026-03-18 14:03:41.16622802 +0000 UTC m=+222.480645995" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.170018 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lkr5r"] Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.170913 4756 patch_prober.go:28] interesting pod/downloads-7954f5f757-dtbtw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.170987 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dtbtw" 
podUID="32ba09f4-59d7-469c-a882-5564e653e868" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.192374 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.201872 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ml5m2" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.206138 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.206490 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48602255-9809-498e-9c4a-6053ba5ff591-catalog-content\") pod \"community-operators-lkr5r\" (UID: \"48602255-9809-498e-9c4a-6053ba5ff591\") " pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.206532 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76bea747-b62f-4749-b01a-207a594758e1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"76bea747-b62f-4749-b01a-207a594758e1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.206570 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/48602255-9809-498e-9c4a-6053ba5ff591-utilities\") pod \"community-operators-lkr5r\" (UID: \"48602255-9809-498e-9c4a-6053ba5ff591\") " pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.206601 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76bea747-b62f-4749-b01a-207a594758e1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"76bea747-b62f-4749-b01a-207a594758e1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.206663 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb2h9\" (UniqueName: \"kubernetes.io/projected/48602255-9809-498e-9c4a-6053ba5ff591-kube-api-access-zb2h9\") pod \"community-operators-lkr5r\" (UID: \"48602255-9809-498e-9c4a-6053ba5ff591\") " pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:03:41 crc kubenswrapper[4756]: E0318 14:03:41.206805 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:41.706789828 +0000 UTC m=+223.021207803 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.207355 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:41 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:41 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:41 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.207410 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.208394 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76bea747-b62f-4749-b01a-207a594758e1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"76bea747-b62f-4749-b01a-207a594758e1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.255570 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76bea747-b62f-4749-b01a-207a594758e1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"76bea747-b62f-4749-b01a-207a594758e1\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.287940 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-95vm5"] Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.288214 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" podUID="42d7ef3d-2c3b-455c-9457-44441f1bfcff" containerName="controller-manager" containerID="cri-o://afe0d69f73678372c93451257c7d5664926ee3689192b71e6d9cbd44bd82813a" gracePeriod=30 Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.298410 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.308555 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48602255-9809-498e-9c4a-6053ba5ff591-catalog-content\") pod \"community-operators-lkr5r\" (UID: \"48602255-9809-498e-9c4a-6053ba5ff591\") " pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.308615 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48602255-9809-498e-9c4a-6053ba5ff591-utilities\") pod \"community-operators-lkr5r\" (UID: \"48602255-9809-498e-9c4a-6053ba5ff591\") " pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.308883 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb2h9\" (UniqueName: \"kubernetes.io/projected/48602255-9809-498e-9c4a-6053ba5ff591-kube-api-access-zb2h9\") pod \"community-operators-lkr5r\" (UID: \"48602255-9809-498e-9c4a-6053ba5ff591\") " 
pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.308905 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.311885 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48602255-9809-498e-9c4a-6053ba5ff591-catalog-content\") pod \"community-operators-lkr5r\" (UID: \"48602255-9809-498e-9c4a-6053ba5ff591\") " pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.316857 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48602255-9809-498e-9c4a-6053ba5ff591-utilities\") pod \"community-operators-lkr5r\" (UID: \"48602255-9809-498e-9c4a-6053ba5ff591\") " pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:03:41 crc kubenswrapper[4756]: E0318 14:03:41.326606 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:41.826592653 +0000 UTC m=+223.141010628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.394797 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v"] Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.414306 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:41 crc kubenswrapper[4756]: E0318 14:03:41.414977 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:41.914958644 +0000 UTC m=+223.229376619 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.397881 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb2h9\" (UniqueName: \"kubernetes.io/projected/48602255-9809-498e-9c4a-6053ba5ff591-kube-api-access-zb2h9\") pod \"community-operators-lkr5r\" (UID: \"48602255-9809-498e-9c4a-6053ba5ff591\") " pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.428098 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w28cp"] Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.432137 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.436683 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.457707 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w28cp"] Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.477585 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.510613 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-77tzj" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.518499 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22d5s\" (UniqueName: \"kubernetes.io/projected/7bb3189f-716d-4fef-b885-3a031a60d981-kube-api-access-22d5s\") pod \"certified-operators-w28cp\" (UID: \"7bb3189f-716d-4fef-b885-3a031a60d981\") " pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.518547 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb3189f-716d-4fef-b885-3a031a60d981-catalog-content\") pod \"certified-operators-w28cp\" (UID: \"7bb3189f-716d-4fef-b885-3a031a60d981\") " pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.518564 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb3189f-716d-4fef-b885-3a031a60d981-utilities\") pod \"certified-operators-w28cp\" (UID: \"7bb3189f-716d-4fef-b885-3a031a60d981\") " pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.518589 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:41 
crc kubenswrapper[4756]: E0318 14:03:41.518858 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:42.018845012 +0000 UTC m=+223.333262987 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.559267 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tfqzk"] Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.560186 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.583496 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tfqzk"] Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.623166 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.623413 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb3189f-716d-4fef-b885-3a031a60d981-catalog-content\") pod \"certified-operators-w28cp\" (UID: \"7bb3189f-716d-4fef-b885-3a031a60d981\") " pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.623451 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb3189f-716d-4fef-b885-3a031a60d981-utilities\") pod \"certified-operators-w28cp\" (UID: \"7bb3189f-716d-4fef-b885-3a031a60d981\") " pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.623500 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzbbt\" (UniqueName: \"kubernetes.io/projected/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-kube-api-access-vzbbt\") pod \"community-operators-tfqzk\" (UID: \"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b\") " pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.623546 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-utilities\") pod \"community-operators-tfqzk\" (UID: \"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b\") " pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.623568 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-catalog-content\") pod \"community-operators-tfqzk\" (UID: \"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b\") " pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.623602 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22d5s\" (UniqueName: \"kubernetes.io/projected/7bb3189f-716d-4fef-b885-3a031a60d981-kube-api-access-22d5s\") pod \"certified-operators-w28cp\" (UID: \"7bb3189f-716d-4fef-b885-3a031a60d981\") " pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:03:41 crc kubenswrapper[4756]: E0318 14:03:41.623900 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:42.12388682 +0000 UTC m=+223.438304795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.624548 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb3189f-716d-4fef-b885-3a031a60d981-catalog-content\") pod \"certified-operators-w28cp\" (UID: \"7bb3189f-716d-4fef-b885-3a031a60d981\") " pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.625412 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb3189f-716d-4fef-b885-3a031a60d981-utilities\") pod \"certified-operators-w28cp\" (UID: \"7bb3189f-716d-4fef-b885-3a031a60d981\") " pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.650884 4756 ???:1] "http: TLS handshake error from 192.168.126.11:47196: no serving certificate available for the kubelet" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.681485 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.683088 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22d5s\" (UniqueName: \"kubernetes.io/projected/7bb3189f-716d-4fef-b885-3a031a60d981-kube-api-access-22d5s\") pod \"certified-operators-w28cp\" (UID: \"7bb3189f-716d-4fef-b885-3a031a60d981\") " pod="openshift-marketplace/certified-operators-w28cp" Mar 
18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.724912 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.724971 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzbbt\" (UniqueName: \"kubernetes.io/projected/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-kube-api-access-vzbbt\") pod \"community-operators-tfqzk\" (UID: \"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b\") " pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.725019 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-utilities\") pod \"community-operators-tfqzk\" (UID: \"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b\") " pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.725042 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-catalog-content\") pod \"community-operators-tfqzk\" (UID: \"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b\") " pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:03:41 crc kubenswrapper[4756]: E0318 14:03:41.725416 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:42.225404464 +0000 UTC m=+223.539822439 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.725752 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-utilities\") pod \"community-operators-tfqzk\" (UID: \"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b\") " pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.725809 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-catalog-content\") pod \"community-operators-tfqzk\" (UID: \"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b\") " pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.785144 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jmwzc"] Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.786311 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.797046 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzbbt\" (UniqueName: \"kubernetes.io/projected/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-kube-api-access-vzbbt\") pod \"community-operators-tfqzk\" (UID: \"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b\") " pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.818980 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jmwzc"] Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.828421 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.829648 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.829847 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c76033c1-1ccb-42ce-ade9-f46428bc0b46-catalog-content\") pod \"certified-operators-jmwzc\" (UID: \"c76033c1-1ccb-42ce-ade9-f46428bc0b46\") " pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.829889 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c76033c1-1ccb-42ce-ade9-f46428bc0b46-utilities\") pod \"certified-operators-jmwzc\" (UID: 
\"c76033c1-1ccb-42ce-ade9-f46428bc0b46\") " pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.829945 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gcxc\" (UniqueName: \"kubernetes.io/projected/c76033c1-1ccb-42ce-ade9-f46428bc0b46-kube-api-access-5gcxc\") pod \"certified-operators-jmwzc\" (UID: \"c76033c1-1ccb-42ce-ade9-f46428bc0b46\") " pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:03:41 crc kubenswrapper[4756]: E0318 14:03:41.830039 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:42.330025652 +0000 UTC m=+223.644443627 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.901623 4756 ???:1] "http: TLS handshake error from 192.168.126.11:47210: no serving certificate available for the kubelet" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.925288 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.932142 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gcxc\" (UniqueName: \"kubernetes.io/projected/c76033c1-1ccb-42ce-ade9-f46428bc0b46-kube-api-access-5gcxc\") pod \"certified-operators-jmwzc\" (UID: \"c76033c1-1ccb-42ce-ade9-f46428bc0b46\") " pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.932621 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.932642 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c76033c1-1ccb-42ce-ade9-f46428bc0b46-catalog-content\") pod \"certified-operators-jmwzc\" (UID: \"c76033c1-1ccb-42ce-ade9-f46428bc0b46\") " pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.932691 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c76033c1-1ccb-42ce-ade9-f46428bc0b46-utilities\") pod \"certified-operators-jmwzc\" (UID: \"c76033c1-1ccb-42ce-ade9-f46428bc0b46\") " pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.933363 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c76033c1-1ccb-42ce-ade9-f46428bc0b46-utilities\") pod \"certified-operators-jmwzc\" 
(UID: \"c76033c1-1ccb-42ce-ade9-f46428bc0b46\") " pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.933853 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c76033c1-1ccb-42ce-ade9-f46428bc0b46-catalog-content\") pod \"certified-operators-jmwzc\" (UID: \"c76033c1-1ccb-42ce-ade9-f46428bc0b46\") " pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:03:41 crc kubenswrapper[4756]: E0318 14:03:41.933920 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:42.433903239 +0000 UTC m=+223.748321214 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:41 crc kubenswrapper[4756]: I0318 14:03:41.959216 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gcxc\" (UniqueName: \"kubernetes.io/projected/c76033c1-1ccb-42ce-ade9-f46428bc0b46-kube-api-access-5gcxc\") pod \"certified-operators-jmwzc\" (UID: \"c76033c1-1ccb-42ce-ade9-f46428bc0b46\") " pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.034052 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:42 crc kubenswrapper[4756]: E0318 14:03:42.034397 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:42.534374296 +0000 UTC m=+223.848792301 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.050455 4756 ???:1] "http: TLS handshake error from 192.168.126.11:47224: no serving certificate available for the kubelet" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.117531 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.135368 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:42 crc kubenswrapper[4756]: E0318 14:03:42.135707 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:42.635694854 +0000 UTC m=+223.950112829 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.155512 4756 ???:1] "http: TLS handshake error from 192.168.126.11:47226: no serving certificate available for the kubelet" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.204266 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:42 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:42 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:42 
crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.204329 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.238972 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:42 crc kubenswrapper[4756]: E0318 14:03:42.239833 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:42.739817208 +0000 UTC m=+224.054235183 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.243181 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" event={"ID":"66923387-ae05-40a5-9a8b-8de577e30cb1","Type":"ContainerStarted","Data":"e6232f9ab5c9c56e721b719b9ea800ec5205138192b5e369203110d62c62ca21"} Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.272556 4756 generic.go:334] "Generic (PLEG): container finished" podID="42d7ef3d-2c3b-455c-9457-44441f1bfcff" containerID="afe0d69f73678372c93451257c7d5664926ee3689192b71e6d9cbd44bd82813a" exitCode=0 Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.273520 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" event={"ID":"42d7ef3d-2c3b-455c-9457-44441f1bfcff","Type":"ContainerDied","Data":"afe0d69f73678372c93451257c7d5664926ee3689192b71e6d9cbd44bd82813a"} Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.275350 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" podUID="1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2" containerName="route-controller-manager" containerID="cri-o://c330c07d871ed30ac915eb719f9a7f906f4f5aff091f63c035cfc950c3d0241f" gracePeriod=30 Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.277332 4756 patch_prober.go:28] interesting pod/downloads-7954f5f757-dtbtw container/download-server namespace/openshift-console: Readiness probe status=failure 
output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.277366 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dtbtw" podUID="32ba09f4-59d7-469c-a882-5564e653e868" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.287087 4756 ???:1] "http: TLS handshake error from 192.168.126.11:47238: no serving certificate available for the kubelet" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.341224 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:42 crc kubenswrapper[4756]: E0318 14:03:42.342495 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:42.842482903 +0000 UTC m=+224.156900878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.408454 4756 ???:1] "http: TLS handshake error from 192.168.126.11:47242: no serving certificate available for the kubelet" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.451547 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:42 crc kubenswrapper[4756]: E0318 14:03:42.451746 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:42.951717625 +0000 UTC m=+224.266135600 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.451808 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:42 crc kubenswrapper[4756]: E0318 14:03:42.452183 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:42.952169657 +0000 UTC m=+224.266587632 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.487441 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w28cp"] Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.552807 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:42 crc kubenswrapper[4756]: E0318 14:03:42.553158 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:43.053142456 +0000 UTC m=+224.367560431 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.579606 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.586251 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lkr5r"] Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.608637 4756 ???:1] "http: TLS handshake error from 192.168.126.11:47246: no serving certificate available for the kubelet" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.619049 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tfqzk"] Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.654262 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.654939 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:42 crc kubenswrapper[4756]: E0318 14:03:42.655258 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:43.155246966 +0000 UTC m=+224.469664941 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.721214 4756 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.729456 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jmwzc"] Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.755921 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-client-ca\") pod \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.755997 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-config\") pod \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.756055 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-proxy-ca-bundles\") pod \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.756208 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.756299 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42d7ef3d-2c3b-455c-9457-44441f1bfcff-serving-cert\") pod \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.756331 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btwv7\" (UniqueName: \"kubernetes.io/projected/42d7ef3d-2c3b-455c-9457-44441f1bfcff-kube-api-access-btwv7\") pod \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\" (UID: \"42d7ef3d-2c3b-455c-9457-44441f1bfcff\") " Mar 18 14:03:42 crc 
kubenswrapper[4756]: I0318 14:03:42.757064 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-client-ca" (OuterVolumeSpecName: "client-ca") pod "42d7ef3d-2c3b-455c-9457-44441f1bfcff" (UID: "42d7ef3d-2c3b-455c-9457-44441f1bfcff"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.757791 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "42d7ef3d-2c3b-455c-9457-44441f1bfcff" (UID: "42d7ef3d-2c3b-455c-9457-44441f1bfcff"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:03:42 crc kubenswrapper[4756]: E0318 14:03:42.757940 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:43.257923341 +0000 UTC m=+224.572341316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.758098 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-config" (OuterVolumeSpecName: "config") pod "42d7ef3d-2c3b-455c-9457-44441f1bfcff" (UID: "42d7ef3d-2c3b-455c-9457-44441f1bfcff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.770048 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d7ef3d-2c3b-455c-9457-44441f1bfcff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "42d7ef3d-2c3b-455c-9457-44441f1bfcff" (UID: "42d7ef3d-2c3b-455c-9457-44441f1bfcff"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.770365 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d7ef3d-2c3b-455c-9457-44441f1bfcff-kube-api-access-btwv7" (OuterVolumeSpecName: "kube-api-access-btwv7") pod "42d7ef3d-2c3b-455c-9457-44441f1bfcff" (UID: "42d7ef3d-2c3b-455c-9457-44441f1bfcff"). InnerVolumeSpecName "kube-api-access-btwv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.865054 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.865227 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.865241 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.865250 4756 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42d7ef3d-2c3b-455c-9457-44441f1bfcff-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.865258 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42d7ef3d-2c3b-455c-9457-44441f1bfcff-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.865266 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btwv7\" (UniqueName: \"kubernetes.io/projected/42d7ef3d-2c3b-455c-9457-44441f1bfcff-kube-api-access-btwv7\") on node \"crc\" DevicePath \"\"" Mar 18 14:03:42 crc kubenswrapper[4756]: E0318 14:03:42.865495 4756 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:43.365483967 +0000 UTC m=+224.679901942 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.965805 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:42 crc kubenswrapper[4756]: E0318 14:03:42.966271 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:43.466248671 +0000 UTC m=+224.780666646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:42 crc kubenswrapper[4756]: I0318 14:03:42.970711 4756 ???:1] "http: TLS handshake error from 192.168.126.11:47250: no serving certificate available for the kubelet" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.067449 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:43 crc kubenswrapper[4756]: E0318 14:03:43.067967 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:43.56794089 +0000 UTC m=+224.882358865 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.113166 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.168633 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22sd5\" (UniqueName: \"kubernetes.io/projected/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-kube-api-access-22sd5\") pod \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.169016 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.169047 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-client-ca\") pod \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.169105 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-serving-cert\") pod \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.169142 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-config\") pod \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\" (UID: \"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2\") " Mar 18 14:03:43 crc kubenswrapper[4756]: E0318 14:03:43.170005 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:43.669948287 +0000 UTC m=+224.984366262 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.170393 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-config" (OuterVolumeSpecName: "config") pod "1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2" (UID: "1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.170413 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-client-ca" (OuterVolumeSpecName: "client-ca") pod "1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2" (UID: "1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.175191 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2" (UID: "1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.175773 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-kube-api-access-22sd5" (OuterVolumeSpecName: "kube-api-access-22sd5") pod "1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2" (UID: "1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2"). InnerVolumeSpecName "kube-api-access-22sd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.202602 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:43 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:43 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:43 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.202670 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.270596 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.270733 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22sd5\" (UniqueName: \"kubernetes.io/projected/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-kube-api-access-22sd5\") on node \"crc\" DevicePath \"\"" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.270746 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.270755 4756 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.270766 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:03:43 crc kubenswrapper[4756]: E0318 14:03:43.272048 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:43.772026257 +0000 UTC m=+225.086444232 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.281446 4756 generic.go:334] "Generic (PLEG): container finished" podID="1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2" containerID="c330c07d871ed30ac915eb719f9a7f906f4f5aff091f63c035cfc950c3d0241f" exitCode=0 Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.281520 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" event={"ID":"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2","Type":"ContainerDied","Data":"c330c07d871ed30ac915eb719f9a7f906f4f5aff091f63c035cfc950c3d0241f"} Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.281547 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" event={"ID":"1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2","Type":"ContainerDied","Data":"67e1b01ee217e330d96e8fdbab351fb318fb47774ac99a3d910c63aa9326c111"} Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.281564 4756 scope.go:117] "RemoveContainer" containerID="c330c07d871ed30ac915eb719f9a7f906f4f5aff091f63c035cfc950c3d0241f" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.281662 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.283865 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jmwzc" event={"ID":"c76033c1-1ccb-42ce-ade9-f46428bc0b46","Type":"ContainerStarted","Data":"25fbc5913f83b47d1ccd0fc00ffa34442cd0ee062fe9b6763d9988f46f2640bc"} Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.286522 4756 generic.go:334] "Generic (PLEG): container finished" podID="31db1d26-2d05-489e-9177-580797f8897c" containerID="3bddb0496b08fed63d21de430db606e49d46cfdd6c4819ab0775c8c6e5033594" exitCode=0 Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.286609 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" event={"ID":"31db1d26-2d05-489e-9177-580797f8897c","Type":"ContainerDied","Data":"3bddb0496b08fed63d21de430db606e49d46cfdd6c4819ab0775c8c6e5033594"} Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.293670 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.293670 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-95vm5" event={"ID":"42d7ef3d-2c3b-455c-9457-44441f1bfcff","Type":"ContainerDied","Data":"599d763dad17aa921a7c1e59f7a1aafcc5bb0c99fdc753fee9999114330b93dd"} Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.296220 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"76bea747-b62f-4749-b01a-207a594758e1","Type":"ContainerStarted","Data":"56a7a580844adfaa1e126697c2a9d5a4e0b9d76d28c9e8586216e29e8fdd542d"} Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.297731 4756 generic.go:334] "Generic (PLEG): container finished" podID="7bb3189f-716d-4fef-b885-3a031a60d981" containerID="00fab9a64615b7815712deba913c7cad59eca83b4fbd4e5ebb424ff868e4da66" exitCode=0 Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.297792 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w28cp" event={"ID":"7bb3189f-716d-4fef-b885-3a031a60d981","Type":"ContainerDied","Data":"00fab9a64615b7815712deba913c7cad59eca83b4fbd4e5ebb424ff868e4da66"} Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.297816 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w28cp" event={"ID":"7bb3189f-716d-4fef-b885-3a031a60d981","Type":"ContainerStarted","Data":"aece27eb672ef310f383f576b2abfc62b69117204b72ffe1273500fce32133f4"} Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.300107 4756 generic.go:334] "Generic (PLEG): container finished" podID="48602255-9809-498e-9c4a-6053ba5ff591" containerID="e3720792b8226fd684f2d032bcb5ba026eeb41406095bfe5fbdb8e3ca7ff5386" exitCode=0 Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.300167 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkr5r" event={"ID":"48602255-9809-498e-9c4a-6053ba5ff591","Type":"ContainerDied","Data":"e3720792b8226fd684f2d032bcb5ba026eeb41406095bfe5fbdb8e3ca7ff5386"} Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.300184 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkr5r" event={"ID":"48602255-9809-498e-9c4a-6053ba5ff591","Type":"ContainerStarted","Data":"2fe1715780a79b3516eaae22ae842cab92e5738b865a1b523a5d20ba2fbc93cf"} Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.306793 4756 scope.go:117] "RemoveContainer" containerID="c330c07d871ed30ac915eb719f9a7f906f4f5aff091f63c035cfc950c3d0241f" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.308658 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" event={"ID":"66923387-ae05-40a5-9a8b-8de577e30cb1","Type":"ContainerStarted","Data":"3e7151c048bdb50460645c9d5907ca423d82f83a7a305a60f12c19e47666f524"} Mar 18 14:03:43 crc kubenswrapper[4756]: E0318 14:03:43.309699 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c330c07d871ed30ac915eb719f9a7f906f4f5aff091f63c035cfc950c3d0241f\": container with ID starting with c330c07d871ed30ac915eb719f9a7f906f4f5aff091f63c035cfc950c3d0241f not found: ID does not exist" containerID="c330c07d871ed30ac915eb719f9a7f906f4f5aff091f63c035cfc950c3d0241f" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.309732 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c330c07d871ed30ac915eb719f9a7f906f4f5aff091f63c035cfc950c3d0241f"} err="failed to get container status \"c330c07d871ed30ac915eb719f9a7f906f4f5aff091f63c035cfc950c3d0241f\": rpc error: code = NotFound desc = could not find container 
\"c330c07d871ed30ac915eb719f9a7f906f4f5aff091f63c035cfc950c3d0241f\": container with ID starting with c330c07d871ed30ac915eb719f9a7f906f4f5aff091f63c035cfc950c3d0241f not found: ID does not exist" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.309753 4756 scope.go:117] "RemoveContainer" containerID="afe0d69f73678372c93451257c7d5664926ee3689192b71e6d9cbd44bd82813a" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.313270 4756 generic.go:334] "Generic (PLEG): container finished" podID="fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" containerID="817d71c44fbce04b94845dbc4acca2b0e2404867dd1e608dbbd975a67d91af96" exitCode=0 Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.313357 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfqzk" event={"ID":"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b","Type":"ContainerDied","Data":"817d71c44fbce04b94845dbc4acca2b0e2404867dd1e608dbbd975a67d91af96"} Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.313460 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfqzk" event={"ID":"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b","Type":"ContainerStarted","Data":"af35670de8506cd4bf15639878f63745dde2272b4ce9b2c8ec7b61efd822045e"} Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.336963 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9hrhl"] Mar 18 14:03:43 crc kubenswrapper[4756]: E0318 14:03:43.337170 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d7ef3d-2c3b-455c-9457-44441f1bfcff" containerName="controller-manager" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.337183 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d7ef3d-2c3b-455c-9457-44441f1bfcff" containerName="controller-manager" Mar 18 14:03:43 crc kubenswrapper[4756]: E0318 14:03:43.337206 4756 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2" containerName="route-controller-manager" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.337212 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2" containerName="route-controller-manager" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.337305 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d7ef3d-2c3b-455c-9457-44441f1bfcff" containerName="controller-manager" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.337317 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2" containerName="route-controller-manager" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.338242 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hrhl"] Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.338344 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.340871 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.372172 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:43 crc kubenswrapper[4756]: E0318 14:03:43.372597 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 14:03:43.872568794 +0000 UTC m=+225.186986769 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.372899 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:43 crc kubenswrapper[4756]: E0318 14:03:43.375492 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 14:03:43.875480762 +0000 UTC m=+225.189898737 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9pdb4" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.399243 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v"] Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.402169 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6mb6v"] Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.422451 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-95vm5"] Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.428072 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-95vm5"] Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.479197 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.479951 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caee1439-b7bb-456e-982f-1c3c3cdb51c3-catalog-content\") pod \"redhat-marketplace-9hrhl\" (UID: \"caee1439-b7bb-456e-982f-1c3c3cdb51c3\") " 
pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.480043 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caee1439-b7bb-456e-982f-1c3c3cdb51c3-utilities\") pod \"redhat-marketplace-9hrhl\" (UID: \"caee1439-b7bb-456e-982f-1c3c3cdb51c3\") " pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.480161 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56d8b\" (UniqueName: \"kubernetes.io/projected/caee1439-b7bb-456e-982f-1c3c3cdb51c3-kube-api-access-56d8b\") pod \"redhat-marketplace-9hrhl\" (UID: \"caee1439-b7bb-456e-982f-1c3c3cdb51c3\") " pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:03:43 crc kubenswrapper[4756]: E0318 14:03:43.481508 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 14:03:43.981469046 +0000 UTC m=+225.295887041 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.516408 4756 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-18T14:03:42.721244617Z","Handler":null,"Name":""} Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.555158 4756 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.555214 4756 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.570978 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp"] Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.572367 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.575631 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.575852 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.576017 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.576195 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.576367 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.576486 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.581371 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56d8b\" (UniqueName: \"kubernetes.io/projected/caee1439-b7bb-456e-982f-1c3c3cdb51c3-kube-api-access-56d8b\") pod \"redhat-marketplace-9hrhl\" (UID: \"caee1439-b7bb-456e-982f-1c3c3cdb51c3\") " pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.581436 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" 
(UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.581502 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caee1439-b7bb-456e-982f-1c3c3cdb51c3-catalog-content\") pod \"redhat-marketplace-9hrhl\" (UID: \"caee1439-b7bb-456e-982f-1c3c3cdb51c3\") " pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.581548 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caee1439-b7bb-456e-982f-1c3c3cdb51c3-utilities\") pod \"redhat-marketplace-9hrhl\" (UID: \"caee1439-b7bb-456e-982f-1c3c3cdb51c3\") " pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.582655 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caee1439-b7bb-456e-982f-1c3c3cdb51c3-utilities\") pod \"redhat-marketplace-9hrhl\" (UID: \"caee1439-b7bb-456e-982f-1c3c3cdb51c3\") " pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.583620 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caee1439-b7bb-456e-982f-1c3c3cdb51c3-catalog-content\") pod \"redhat-marketplace-9hrhl\" (UID: \"caee1439-b7bb-456e-982f-1c3c3cdb51c3\") " pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.584266 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8f976fbbb-kktkr"] Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.585105 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.589077 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp"] Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.591911 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8f976fbbb-kktkr"] Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.596014 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.600138 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.601266 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.601576 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.601818 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.602219 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.602355 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.609290 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56d8b\" (UniqueName: 
\"kubernetes.io/projected/caee1439-b7bb-456e-982f-1c3c3cdb51c3-kube-api-access-56d8b\") pod \"redhat-marketplace-9hrhl\" (UID: \"caee1439-b7bb-456e-982f-1c3c3cdb51c3\") " pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.629079 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.629512 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.637092 4756 ???:1] "http: TLS handshake error from 192.168.126.11:47260: no serving certificate available for the kubelet" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.650899 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.657425 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9pdb4\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.682975 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.684242 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/258e7cae-bb8d-43cb-afc6-a133c8a38678-serving-cert\") pod \"controller-manager-8f976fbbb-kktkr\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.684382 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84c694d6-e399-4e86-b988-074eb76dd7c6-serving-cert\") pod \"route-controller-manager-5c978874dc-cllkp\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.684472 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-config\") pod \"controller-manager-8f976fbbb-kktkr\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.684574 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84c694d6-e399-4e86-b988-074eb76dd7c6-config\") pod \"route-controller-manager-5c978874dc-cllkp\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.684610 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-client-ca\") pod \"controller-manager-8f976fbbb-kktkr\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.684684 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-proxy-ca-bundles\") pod \"controller-manager-8f976fbbb-kktkr\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.684727 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfklh\" (UniqueName: \"kubernetes.io/projected/258e7cae-bb8d-43cb-afc6-a133c8a38678-kube-api-access-jfklh\") pod \"controller-manager-8f976fbbb-kktkr\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " 
pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.684759 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84c694d6-e399-4e86-b988-074eb76dd7c6-client-ca\") pod \"route-controller-manager-5c978874dc-cllkp\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.684790 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8pgx\" (UniqueName: \"kubernetes.io/projected/84c694d6-e399-4e86-b988-074eb76dd7c6-kube-api-access-q8pgx\") pod \"route-controller-manager-5c978874dc-cllkp\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.695759 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.726987 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dhpzd"] Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.728681 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.735159 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhpzd"] Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.787041 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8pgx\" (UniqueName: \"kubernetes.io/projected/84c694d6-e399-4e86-b988-074eb76dd7c6-kube-api-access-q8pgx\") pod \"route-controller-manager-5c978874dc-cllkp\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.787170 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/258e7cae-bb8d-43cb-afc6-a133c8a38678-serving-cert\") pod \"controller-manager-8f976fbbb-kktkr\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.787196 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/604a2bca-c232-4242-a46c-31630b85585d-utilities\") pod \"redhat-marketplace-dhpzd\" (UID: \"604a2bca-c232-4242-a46c-31630b85585d\") " pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.787224 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84c694d6-e399-4e86-b988-074eb76dd7c6-serving-cert\") pod \"route-controller-manager-5c978874dc-cllkp\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 
14:03:43.787257 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-config\") pod \"controller-manager-8f976fbbb-kktkr\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.787281 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84c694d6-e399-4e86-b988-074eb76dd7c6-config\") pod \"route-controller-manager-5c978874dc-cllkp\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.787298 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-client-ca\") pod \"controller-manager-8f976fbbb-kktkr\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.787316 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlvf7\" (UniqueName: \"kubernetes.io/projected/604a2bca-c232-4242-a46c-31630b85585d-kube-api-access-xlvf7\") pod \"redhat-marketplace-dhpzd\" (UID: \"604a2bca-c232-4242-a46c-31630b85585d\") " pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.787333 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/604a2bca-c232-4242-a46c-31630b85585d-catalog-content\") pod \"redhat-marketplace-dhpzd\" (UID: \"604a2bca-c232-4242-a46c-31630b85585d\") " 
pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.787358 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-proxy-ca-bundles\") pod \"controller-manager-8f976fbbb-kktkr\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.787382 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfklh\" (UniqueName: \"kubernetes.io/projected/258e7cae-bb8d-43cb-afc6-a133c8a38678-kube-api-access-jfklh\") pod \"controller-manager-8f976fbbb-kktkr\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.787401 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84c694d6-e399-4e86-b988-074eb76dd7c6-client-ca\") pod \"route-controller-manager-5c978874dc-cllkp\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.788400 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84c694d6-e399-4e86-b988-074eb76dd7c6-client-ca\") pod \"route-controller-manager-5c978874dc-cllkp\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.792717 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-client-ca\") pod \"controller-manager-8f976fbbb-kktkr\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.793229 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-proxy-ca-bundles\") pod \"controller-manager-8f976fbbb-kktkr\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.794936 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84c694d6-e399-4e86-b988-074eb76dd7c6-config\") pod \"route-controller-manager-5c978874dc-cllkp\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.795418 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/258e7cae-bb8d-43cb-afc6-a133c8a38678-serving-cert\") pod \"controller-manager-8f976fbbb-kktkr\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.796047 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-config\") pod \"controller-manager-8f976fbbb-kktkr\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.798663 4756 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84c694d6-e399-4e86-b988-074eb76dd7c6-serving-cert\") pod \"route-controller-manager-5c978874dc-cllkp\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.809161 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8pgx\" (UniqueName: \"kubernetes.io/projected/84c694d6-e399-4e86-b988-074eb76dd7c6-kube-api-access-q8pgx\") pod \"route-controller-manager-5c978874dc-cllkp\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.814131 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfklh\" (UniqueName: \"kubernetes.io/projected/258e7cae-bb8d-43cb-afc6-a133c8a38678-kube-api-access-jfklh\") pod \"controller-manager-8f976fbbb-kktkr\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.892703 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlvf7\" (UniqueName: \"kubernetes.io/projected/604a2bca-c232-4242-a46c-31630b85585d-kube-api-access-xlvf7\") pod \"redhat-marketplace-dhpzd\" (UID: \"604a2bca-c232-4242-a46c-31630b85585d\") " pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.892747 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/604a2bca-c232-4242-a46c-31630b85585d-catalog-content\") pod \"redhat-marketplace-dhpzd\" (UID: \"604a2bca-c232-4242-a46c-31630b85585d\") " pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:03:43 crc 
kubenswrapper[4756]: I0318 14:03:43.893143 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/604a2bca-c232-4242-a46c-31630b85585d-utilities\") pod \"redhat-marketplace-dhpzd\" (UID: \"604a2bca-c232-4242-a46c-31630b85585d\") " pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.893516 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/604a2bca-c232-4242-a46c-31630b85585d-utilities\") pod \"redhat-marketplace-dhpzd\" (UID: \"604a2bca-c232-4242-a46c-31630b85585d\") " pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.893642 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/604a2bca-c232-4242-a46c-31630b85585d-catalog-content\") pod \"redhat-marketplace-dhpzd\" (UID: \"604a2bca-c232-4242-a46c-31630b85585d\") " pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.900803 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.909570 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hrhl"] Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.911529 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlvf7\" (UniqueName: \"kubernetes.io/projected/604a2bca-c232-4242-a46c-31630b85585d-kube-api-access-xlvf7\") pod \"redhat-marketplace-dhpzd\" (UID: \"604a2bca-c232-4242-a46c-31630b85585d\") " pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.923633 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:43 crc kubenswrapper[4756]: I0318 14:03:43.928858 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.052666 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.200819 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:44 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:44 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:44 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.201658 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.229862 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9pdb4"] Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.281094 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8f976fbbb-kktkr"] Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.321384 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhpzd"] Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.336674 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gs6l5"] Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.338784 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.340617 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.346594 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gs6l5"] Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.349815 4756 generic.go:334] "Generic (PLEG): container finished" podID="76bea747-b62f-4749-b01a-207a594758e1" containerID="148028d8e741e19865cf3f27e3eb9745e3a93ef6ba4428af1c719b4637716d0d" exitCode=0 Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.350628 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"76bea747-b62f-4749-b01a-207a594758e1","Type":"ContainerDied","Data":"148028d8e741e19865cf3f27e3eb9745e3a93ef6ba4428af1c719b4637716d0d"} Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.367909 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" event={"ID":"a15d24cf-4182-44bb-9d60-33649137cc83","Type":"ContainerStarted","Data":"eefa0a132bc5d97acb29465466a79a2100c2aebadfc104b20f5ec1adbb109e32"} Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.373381 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp"] Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.378687 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" event={"ID":"258e7cae-bb8d-43cb-afc6-a133c8a38678","Type":"ContainerStarted","Data":"55681ec73f5decd825a5ed7a9d69457d63e62fb031d954eab1ca02952b3adc1e"} Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.380951 4756 generic.go:334] "Generic (PLEG): 
container finished" podID="c76033c1-1ccb-42ce-ade9-f46428bc0b46" containerID="ff1c83ffdece8817475d2b00bc4d5840cb8ebe317ec27c721e3a19407dc3f3cf" exitCode=0 Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.381031 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jmwzc" event={"ID":"c76033c1-1ccb-42ce-ade9-f46428bc0b46","Type":"ContainerDied","Data":"ff1c83ffdece8817475d2b00bc4d5840cb8ebe317ec27c721e3a19407dc3f3cf"} Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.409102 4756 generic.go:334] "Generic (PLEG): container finished" podID="caee1439-b7bb-456e-982f-1c3c3cdb51c3" containerID="1d71cf9b79bb64fb6ef9118e775f87a02aa070d6dd8aa6c974cd903eaed6c66e" exitCode=0 Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.409235 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hrhl" event={"ID":"caee1439-b7bb-456e-982f-1c3c3cdb51c3","Type":"ContainerDied","Data":"1d71cf9b79bb64fb6ef9118e775f87a02aa070d6dd8aa6c974cd903eaed6c66e"} Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.409270 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hrhl" event={"ID":"caee1439-b7bb-456e-982f-1c3c3cdb51c3","Type":"ContainerStarted","Data":"cfa3784752dbd4524d3b17b32e4236210fa92fb35e5cfc9795621d27bd778db2"} Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.412479 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af462049-61c3-4da5-aeb0-0311404c4741-utilities\") pod \"redhat-operators-gs6l5\" (UID: \"af462049-61c3-4da5-aeb0-0311404c4741\") " pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.412514 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/af462049-61c3-4da5-aeb0-0311404c4741-catalog-content\") pod \"redhat-operators-gs6l5\" (UID: \"af462049-61c3-4da5-aeb0-0311404c4741\") " pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.412566 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jcxq\" (UniqueName: \"kubernetes.io/projected/af462049-61c3-4da5-aeb0-0311404c4741-kube-api-access-8jcxq\") pod \"redhat-operators-gs6l5\" (UID: \"af462049-61c3-4da5-aeb0-0311404c4741\") " pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.419162 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" event={"ID":"66923387-ae05-40a5-9a8b-8de577e30cb1","Type":"ContainerStarted","Data":"39f12c0df7cca7802d516d0f642d56a969bc72a272eb52bec31a0238acea6804"} Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.452170 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-p9mcq" podStartSLOduration=12.452152594 podStartE2EDuration="12.452152594s" podCreationTimestamp="2026-03-18 14:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:44.447475139 +0000 UTC m=+225.761893114" watchObservedRunningTime="2026-03-18 14:03:44.452152594 +0000 UTC m=+225.766570579" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.514075 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af462049-61c3-4da5-aeb0-0311404c4741-utilities\") pod \"redhat-operators-gs6l5\" (UID: \"af462049-61c3-4da5-aeb0-0311404c4741\") " pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.514127 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af462049-61c3-4da5-aeb0-0311404c4741-catalog-content\") pod \"redhat-operators-gs6l5\" (UID: \"af462049-61c3-4da5-aeb0-0311404c4741\") " pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.514165 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jcxq\" (UniqueName: \"kubernetes.io/projected/af462049-61c3-4da5-aeb0-0311404c4741-kube-api-access-8jcxq\") pod \"redhat-operators-gs6l5\" (UID: \"af462049-61c3-4da5-aeb0-0311404c4741\") " pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.515199 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af462049-61c3-4da5-aeb0-0311404c4741-utilities\") pod \"redhat-operators-gs6l5\" (UID: \"af462049-61c3-4da5-aeb0-0311404c4741\") " pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.515233 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af462049-61c3-4da5-aeb0-0311404c4741-catalog-content\") pod \"redhat-operators-gs6l5\" (UID: \"af462049-61c3-4da5-aeb0-0311404c4741\") " pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.538368 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jcxq\" (UniqueName: \"kubernetes.io/projected/af462049-61c3-4da5-aeb0-0311404c4741-kube-api-access-8jcxq\") pod \"redhat-operators-gs6l5\" (UID: \"af462049-61c3-4da5-aeb0-0311404c4741\") " pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.618965 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kgtbj" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.669679 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.707323 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.717986 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cn8ht"] Mar 18 14:03:44 crc kubenswrapper[4756]: E0318 14:03:44.718603 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31db1d26-2d05-489e-9177-580797f8897c" containerName="collect-profiles" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.718672 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="31db1d26-2d05-489e-9177-580797f8897c" containerName="collect-profiles" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.718840 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="31db1d26-2d05-489e-9177-580797f8897c" containerName="collect-profiles" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.719722 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.727152 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cn8ht"] Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.819593 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31db1d26-2d05-489e-9177-580797f8897c-config-volume\") pod \"31db1d26-2d05-489e-9177-580797f8897c\" (UID: \"31db1d26-2d05-489e-9177-580797f8897c\") " Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.819685 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/31db1d26-2d05-489e-9177-580797f8897c-secret-volume\") pod \"31db1d26-2d05-489e-9177-580797f8897c\" (UID: \"31db1d26-2d05-489e-9177-580797f8897c\") " Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.819719 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v65kz\" (UniqueName: \"kubernetes.io/projected/31db1d26-2d05-489e-9177-580797f8897c-kube-api-access-v65kz\") pod \"31db1d26-2d05-489e-9177-580797f8897c\" (UID: \"31db1d26-2d05-489e-9177-580797f8897c\") " Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.819991 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-catalog-content\") pod \"redhat-operators-cn8ht\" (UID: \"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55\") " pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.820037 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt6kl\" (UniqueName: 
\"kubernetes.io/projected/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-kube-api-access-pt6kl\") pod \"redhat-operators-cn8ht\" (UID: \"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55\") " pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.820067 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-utilities\") pod \"redhat-operators-cn8ht\" (UID: \"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55\") " pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.821529 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31db1d26-2d05-489e-9177-580797f8897c-config-volume" (OuterVolumeSpecName: "config-volume") pod "31db1d26-2d05-489e-9177-580797f8897c" (UID: "31db1d26-2d05-489e-9177-580797f8897c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.831867 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31db1d26-2d05-489e-9177-580797f8897c-kube-api-access-v65kz" (OuterVolumeSpecName: "kube-api-access-v65kz") pod "31db1d26-2d05-489e-9177-580797f8897c" (UID: "31db1d26-2d05-489e-9177-580797f8897c"). InnerVolumeSpecName "kube-api-access-v65kz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.832284 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31db1d26-2d05-489e-9177-580797f8897c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "31db1d26-2d05-489e-9177-580797f8897c" (UID: "31db1d26-2d05-489e-9177-580797f8897c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.911006 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.921009 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-utilities\") pod \"redhat-operators-cn8ht\" (UID: \"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55\") " pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.921477 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-utilities\") pod \"redhat-operators-cn8ht\" (UID: \"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55\") " pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.921596 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-catalog-content\") pod \"redhat-operators-cn8ht\" (UID: \"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55\") " pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.921642 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt6kl\" (UniqueName: \"kubernetes.io/projected/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-kube-api-access-pt6kl\") pod \"redhat-operators-cn8ht\" (UID: \"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55\") " pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.921713 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/31db1d26-2d05-489e-9177-580797f8897c-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.921725 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v65kz\" (UniqueName: \"kubernetes.io/projected/31db1d26-2d05-489e-9177-580797f8897c-kube-api-access-v65kz\") on node \"crc\" DevicePath \"\"" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.921733 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31db1d26-2d05-489e-9177-580797f8897c-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.922047 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-catalog-content\") pod \"redhat-operators-cn8ht\" (UID: \"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55\") " pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.923403 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.935511 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.950136 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt6kl\" (UniqueName: \"kubernetes.io/projected/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-kube-api-access-pt6kl\") pod \"redhat-operators-cn8ht\" (UID: \"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55\") " pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.953271 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j87k5" Mar 18 14:03:44 crc kubenswrapper[4756]: I0318 14:03:44.959103 4756 ???:1] "http: TLS handshake error from 192.168.126.11:47276: no serving certificate available for the kubelet" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.048234 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.197367 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.201605 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:45 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:45 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:45 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.201654 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.210302 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gs6l5"] Mar 18 14:03:45 crc kubenswrapper[4756]: W0318 14:03:45.286401 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf462049_61c3_4da5_aeb0_0311404c4741.slice/crio-d5029652368aa8e800239e9ab67e0977bd39fe8cd70d1d8ae729a230276c88ce WatchSource:0}: Error finding container 
d5029652368aa8e800239e9ab67e0977bd39fe8cd70d1d8ae729a230276c88ce: Status 404 returned error can't find the container with id d5029652368aa8e800239e9ab67e0977bd39fe8cd70d1d8ae729a230276c88ce Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.339297 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2" path="/var/lib/kubelet/pods/1bf3630a-c25d-495d-b0c7-5b9c7a8fd8a2/volumes" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.340020 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42d7ef3d-2c3b-455c-9457-44441f1bfcff" path="/var/lib/kubelet/pods/42d7ef3d-2c3b-455c-9457-44441f1bfcff/volumes" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.340659 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.341212 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cn8ht"] Mar 18 14:03:45 crc kubenswrapper[4756]: W0318 14:03:45.360590 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a6ae3a_3fd6_4254_9ca5_8654eee53b55.slice/crio-f131c64ccd8383e1644568a192914c0c825e5cd2213507c4eee948653ab6d239 WatchSource:0}: Error finding container f131c64ccd8383e1644568a192914c0c825e5cd2213507c4eee948653ab6d239: Status 404 returned error can't find the container with id f131c64ccd8383e1644568a192914c0c825e5cd2213507c4eee948653ab6d239 Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.448002 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" 
event={"ID":"84c694d6-e399-4e86-b988-074eb76dd7c6","Type":"ContainerStarted","Data":"79f141a1e9c74017c1590e25b98f1a1968d9b47a69f4f37cd56d6c4646b21d71"} Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.448049 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" event={"ID":"84c694d6-e399-4e86-b988-074eb76dd7c6","Type":"ContainerStarted","Data":"43c57c24887feada5ccb22af1703f908d45d663ca5c42c39a08ee777fe443ed7"} Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.448537 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.456520 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.456804 4756 generic.go:334] "Generic (PLEG): container finished" podID="604a2bca-c232-4242-a46c-31630b85585d" containerID="bbd90c10f3d1f4bb42cacc760659b0825df4bb8e1b58616fc88f4e3eff40dd95" exitCode=0 Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.456875 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhpzd" event={"ID":"604a2bca-c232-4242-a46c-31630b85585d","Type":"ContainerDied","Data":"bbd90c10f3d1f4bb42cacc760659b0825df4bb8e1b58616fc88f4e3eff40dd95"} Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.456900 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhpzd" event={"ID":"604a2bca-c232-4242-a46c-31630b85585d","Type":"ContainerStarted","Data":"99232e276a56d718927bb756b17f640904c229559b9bedc1aacd60bbdc8f251b"} Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.470090 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" podStartSLOduration=4.470070079 podStartE2EDuration="4.470070079s" podCreationTimestamp="2026-03-18 14:03:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:45.462531616 +0000 UTC m=+226.776949611" watchObservedRunningTime="2026-03-18 14:03:45.470070079 +0000 UTC m=+226.784488064" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.474264 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" event={"ID":"31db1d26-2d05-489e-9177-580797f8897c","Type":"ContainerDied","Data":"67546d7535e9cf4202b1b853b76e2d9d0ec55d4ae9e108cafc25da469f6e947e"} Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.474300 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67546d7535e9cf4202b1b853b76e2d9d0ec55d4ae9e108cafc25da469f6e947e" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.474766 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.486164 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" event={"ID":"a15d24cf-4182-44bb-9d60-33649137cc83","Type":"ContainerStarted","Data":"e24d33b52f39665a23cc6883f6a87c34b66ef6825cf95aca6a40e45f4b59bca2"} Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.486907 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.501157 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs6l5" event={"ID":"af462049-61c3-4da5-aeb0-0311404c4741","Type":"ContainerStarted","Data":"d5029652368aa8e800239e9ab67e0977bd39fe8cd70d1d8ae729a230276c88ce"} Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.519054 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn8ht" event={"ID":"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55","Type":"ContainerStarted","Data":"f131c64ccd8383e1644568a192914c0c825e5cd2213507c4eee948653ab6d239"} Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.525401 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.535052 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.538052 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" event={"ID":"258e7cae-bb8d-43cb-afc6-a133c8a38678","Type":"ContainerStarted","Data":"f5d0742169a5e3bf3f6aed116b7b6e7a8088e10c7415302841c779087a57d61c"} Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.538080 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.543454 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" podStartSLOduration=169.543395106 podStartE2EDuration="2m49.543395106s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:45.541403634 +0000 UTC m=+226.855821619" watchObservedRunningTime="2026-03-18 14:03:45.543395106 +0000 UTC m=+226.857813091" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.543767 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.544170 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.556237 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.566174 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:03:45 crc kubenswrapper[4756]: 
I0318 14:03:45.566449 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-pvzfk" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.569273 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" podStartSLOduration=4.569253371 podStartE2EDuration="4.569253371s" podCreationTimestamp="2026-03-18 14:03:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:45.567194435 +0000 UTC m=+226.881612420" watchObservedRunningTime="2026-03-18 14:03:45.569253371 +0000 UTC m=+226.883671346" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.631771 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.631824 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.636819 4756 patch_prober.go:28] interesting pod/console-f9d7485db-k5xg9 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.636903 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-k5xg9" podUID="f88a8bdd-954f-455c-aad1-03b1988afa37" containerName="console" probeResult="failure" output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.642208 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/bb254a39-6093-4d92-aa6e-f3471c8bfcb9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bb254a39-6093-4d92-aa6e-f3471c8bfcb9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.642384 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb254a39-6093-4d92-aa6e-f3471c8bfcb9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bb254a39-6093-4d92-aa6e-f3471c8bfcb9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.743585 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb254a39-6093-4d92-aa6e-f3471c8bfcb9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bb254a39-6093-4d92-aa6e-f3471c8bfcb9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.743726 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb254a39-6093-4d92-aa6e-f3471c8bfcb9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bb254a39-6093-4d92-aa6e-f3471c8bfcb9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.744188 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb254a39-6093-4d92-aa6e-f3471c8bfcb9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bb254a39-6093-4d92-aa6e-f3471c8bfcb9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.770581 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/bb254a39-6093-4d92-aa6e-f3471c8bfcb9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bb254a39-6093-4d92-aa6e-f3471c8bfcb9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.857663 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.890324 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.949306 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76bea747-b62f-4749-b01a-207a594758e1-kube-api-access\") pod \"76bea747-b62f-4749-b01a-207a594758e1\" (UID: \"76bea747-b62f-4749-b01a-207a594758e1\") " Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.949394 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76bea747-b62f-4749-b01a-207a594758e1-kubelet-dir\") pod \"76bea747-b62f-4749-b01a-207a594758e1\" (UID: \"76bea747-b62f-4749-b01a-207a594758e1\") " Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.949773 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76bea747-b62f-4749-b01a-207a594758e1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "76bea747-b62f-4749-b01a-207a594758e1" (UID: "76bea747-b62f-4749-b01a-207a594758e1"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.949925 4756 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76bea747-b62f-4749-b01a-207a594758e1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 14:03:45 crc kubenswrapper[4756]: I0318 14:03:45.953847 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76bea747-b62f-4749-b01a-207a594758e1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "76bea747-b62f-4749-b01a-207a594758e1" (UID: "76bea747-b62f-4749-b01a-207a594758e1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.009468 4756 patch_prober.go:28] interesting pod/downloads-7954f5f757-dtbtw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.009584 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dtbtw" podUID="32ba09f4-59d7-469c-a882-5564e653e868" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.010578 4756 patch_prober.go:28] interesting pod/downloads-7954f5f757-dtbtw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.010645 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dtbtw" 
podUID="32ba09f4-59d7-469c-a882-5564e653e868" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.050789 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76bea747-b62f-4749-b01a-207a594758e1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.200748 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:46 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:46 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:46 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.201076 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.426671 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.564309 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"76bea747-b62f-4749-b01a-207a594758e1","Type":"ContainerDied","Data":"56a7a580844adfaa1e126697c2a9d5a4e0b9d76d28c9e8586216e29e8fdd542d"} Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.564344 4756 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="56a7a580844adfaa1e126697c2a9d5a4e0b9d76d28c9e8586216e29e8fdd542d" Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.564352 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.573492 4756 generic.go:334] "Generic (PLEG): container finished" podID="d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" containerID="ac26d600defb2938d1ab4c3bc8730842bb10051bb0050048102e489bcddf7cea" exitCode=0 Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.573564 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn8ht" event={"ID":"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55","Type":"ContainerDied","Data":"ac26d600defb2938d1ab4c3bc8730842bb10051bb0050048102e489bcddf7cea"} Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.583051 4756 generic.go:334] "Generic (PLEG): container finished" podID="af462049-61c3-4da5-aeb0-0311404c4741" containerID="d00bc90cc9e2d8f822d27b8a60b1ff0eba4838cd63baef99a62c6e3551a382dd" exitCode=0 Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.583103 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs6l5" event={"ID":"af462049-61c3-4da5-aeb0-0311404c4741","Type":"ContainerDied","Data":"d00bc90cc9e2d8f822d27b8a60b1ff0eba4838cd63baef99a62c6e3551a382dd"} Mar 18 14:03:46 crc kubenswrapper[4756]: I0318 14:03:46.589402 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bb254a39-6093-4d92-aa6e-f3471c8bfcb9","Type":"ContainerStarted","Data":"a4586e4a3368adb4aa1485a875bc04a7fba2b1e52109abb4a34328a5338aa417"} Mar 18 14:03:47 crc kubenswrapper[4756]: I0318 14:03:47.200187 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:47 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:47 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:47 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:47 crc kubenswrapper[4756]: I0318 14:03:47.200691 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:47 crc kubenswrapper[4756]: I0318 14:03:47.553821 4756 ???:1] "http: TLS handshake error from 192.168.126.11:50290: no serving certificate available for the kubelet" Mar 18 14:03:47 crc kubenswrapper[4756]: I0318 14:03:47.600483 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bb254a39-6093-4d92-aa6e-f3471c8bfcb9","Type":"ContainerStarted","Data":"7d70ad31289d394501676f38c02ad0530eb9adb265e67bda96f74e66b6081e0a"} Mar 18 14:03:47 crc kubenswrapper[4756]: I0318 14:03:47.619774 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.619755924 podStartE2EDuration="2.619755924s" podCreationTimestamp="2026-03-18 14:03:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:03:47.613095465 +0000 UTC m=+228.927513440" watchObservedRunningTime="2026-03-18 14:03:47.619755924 +0000 UTC m=+228.934173889" Mar 18 14:03:48 crc kubenswrapper[4756]: I0318 14:03:48.199067 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:48 crc 
kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:48 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:48 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:48 crc kubenswrapper[4756]: I0318 14:03:48.199389 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:48 crc kubenswrapper[4756]: I0318 14:03:48.612911 4756 generic.go:334] "Generic (PLEG): container finished" podID="bb254a39-6093-4d92-aa6e-f3471c8bfcb9" containerID="7d70ad31289d394501676f38c02ad0530eb9adb265e67bda96f74e66b6081e0a" exitCode=0 Mar 18 14:03:48 crc kubenswrapper[4756]: I0318 14:03:48.612957 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bb254a39-6093-4d92-aa6e-f3471c8bfcb9","Type":"ContainerDied","Data":"7d70ad31289d394501676f38c02ad0530eb9adb265e67bda96f74e66b6081e0a"} Mar 18 14:03:48 crc kubenswrapper[4756]: I0318 14:03:48.824143 4756 ???:1] "http: TLS handshake error from 192.168.126.11:50302: no serving certificate available for the kubelet" Mar 18 14:03:49 crc kubenswrapper[4756]: I0318 14:03:49.198512 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:49 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:49 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:49 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:49 crc kubenswrapper[4756]: I0318 14:03:49.198559 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:50 crc kubenswrapper[4756]: I0318 14:03:50.199620 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:50 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:50 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:50 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:50 crc kubenswrapper[4756]: I0318 14:03:50.200012 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:51 crc kubenswrapper[4756]: I0318 14:03:51.050792 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vxnd9" Mar 18 14:03:51 crc kubenswrapper[4756]: I0318 14:03:51.199473 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:51 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:51 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:51 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:51 crc kubenswrapper[4756]: I0318 14:03:51.199520 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:52 crc kubenswrapper[4756]: I0318 14:03:52.199847 4756 
patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:52 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:52 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:52 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:52 crc kubenswrapper[4756]: I0318 14:03:52.199931 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:52 crc kubenswrapper[4756]: I0318 14:03:52.701767 4756 ???:1] "http: TLS handshake error from 192.168.126.11:50314: no serving certificate available for the kubelet" Mar 18 14:03:53 crc kubenswrapper[4756]: I0318 14:03:53.198984 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:53 crc kubenswrapper[4756]: [-]has-synced failed: reason withheld Mar 18 14:03:53 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:53 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:53 crc kubenswrapper[4756]: I0318 14:03:53.199043 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:54 crc kubenswrapper[4756]: I0318 14:03:54.198769 4756 patch_prober.go:28] interesting pod/router-default-5444994796-f9hzx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP 
probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 14:03:54 crc kubenswrapper[4756]: [+]has-synced ok Mar 18 14:03:54 crc kubenswrapper[4756]: [+]process-running ok Mar 18 14:03:54 crc kubenswrapper[4756]: healthz check failed Mar 18 14:03:54 crc kubenswrapper[4756]: I0318 14:03:54.199233 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f9hzx" podUID="d199014d-5b92-48d1-966d-30af0da2e1c2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 14:03:55 crc kubenswrapper[4756]: I0318 14:03:55.199580 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:55 crc kubenswrapper[4756]: I0318 14:03:55.201988 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-f9hzx" Mar 18 14:03:55 crc kubenswrapper[4756]: I0318 14:03:55.678014 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:55 crc kubenswrapper[4756]: I0318 14:03:55.683076 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:03:56 crc kubenswrapper[4756]: I0318 14:03:56.013480 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-dtbtw" Mar 18 14:03:57 crc kubenswrapper[4756]: I0318 14:03:57.708887 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs\") pod \"network-metrics-daemon-gfdtl\" (UID: \"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\") " pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:57 crc kubenswrapper[4756]: I0318 14:03:57.721414 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11d2088-741c-4812-8eb2-ccfc3d0c7d11-metrics-certs\") pod \"network-metrics-daemon-gfdtl\" (UID: \"c11d2088-741c-4812-8eb2-ccfc3d0c7d11\") " pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:03:57 crc kubenswrapper[4756]: I0318 14:03:57.737322 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gfdtl" Mar 18 14:04:00 crc kubenswrapper[4756]: I0318 14:04:00.140075 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564044-fs4jl"] Mar 18 14:04:00 crc kubenswrapper[4756]: E0318 14:04:00.140646 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76bea747-b62f-4749-b01a-207a594758e1" containerName="pruner" Mar 18 14:04:00 crc kubenswrapper[4756]: I0318 14:04:00.140660 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="76bea747-b62f-4749-b01a-207a594758e1" containerName="pruner" Mar 18 14:04:00 crc kubenswrapper[4756]: I0318 14:04:00.140817 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="76bea747-b62f-4749-b01a-207a594758e1" containerName="pruner" Mar 18 14:04:00 crc kubenswrapper[4756]: I0318 14:04:00.141311 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564044-fs4jl" Mar 18 14:04:00 crc kubenswrapper[4756]: I0318 14:04:00.143000 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:04:00 crc kubenswrapper[4756]: I0318 14:04:00.149424 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564044-fs4jl"] Mar 18 14:04:00 crc kubenswrapper[4756]: I0318 14:04:00.239658 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdmt6\" (UniqueName: \"kubernetes.io/projected/0f367985-a362-46d8-8dab-205cb7756e9e-kube-api-access-fdmt6\") pod \"auto-csr-approver-29564044-fs4jl\" (UID: \"0f367985-a362-46d8-8dab-205cb7756e9e\") " pod="openshift-infra/auto-csr-approver-29564044-fs4jl" Mar 18 14:04:00 crc kubenswrapper[4756]: I0318 14:04:00.341438 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdmt6\" (UniqueName: \"kubernetes.io/projected/0f367985-a362-46d8-8dab-205cb7756e9e-kube-api-access-fdmt6\") pod \"auto-csr-approver-29564044-fs4jl\" (UID: \"0f367985-a362-46d8-8dab-205cb7756e9e\") " pod="openshift-infra/auto-csr-approver-29564044-fs4jl" Mar 18 14:04:00 crc kubenswrapper[4756]: I0318 14:04:00.365417 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdmt6\" (UniqueName: \"kubernetes.io/projected/0f367985-a362-46d8-8dab-205cb7756e9e-kube-api-access-fdmt6\") pod \"auto-csr-approver-29564044-fs4jl\" (UID: \"0f367985-a362-46d8-8dab-205cb7756e9e\") " pod="openshift-infra/auto-csr-approver-29564044-fs4jl" Mar 18 14:04:00 crc kubenswrapper[4756]: I0318 14:04:00.459234 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564044-fs4jl" Mar 18 14:04:00 crc kubenswrapper[4756]: I0318 14:04:00.567313 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8f976fbbb-kktkr"] Mar 18 14:04:00 crc kubenswrapper[4756]: I0318 14:04:00.567514 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" podUID="258e7cae-bb8d-43cb-afc6-a133c8a38678" containerName="controller-manager" containerID="cri-o://f5d0742169a5e3bf3f6aed116b7b6e7a8088e10c7415302841c779087a57d61c" gracePeriod=30 Mar 18 14:04:00 crc kubenswrapper[4756]: I0318 14:04:00.588463 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp"] Mar 18 14:04:00 crc kubenswrapper[4756]: I0318 14:04:00.588786 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" podUID="84c694d6-e399-4e86-b988-074eb76dd7c6" containerName="route-controller-manager" containerID="cri-o://79f141a1e9c74017c1590e25b98f1a1968d9b47a69f4f37cd56d6c4646b21d71" gracePeriod=30 Mar 18 14:04:01 crc kubenswrapper[4756]: I0318 14:04:01.709327 4756 generic.go:334] "Generic (PLEG): container finished" podID="258e7cae-bb8d-43cb-afc6-a133c8a38678" containerID="f5d0742169a5e3bf3f6aed116b7b6e7a8088e10c7415302841c779087a57d61c" exitCode=0 Mar 18 14:04:01 crc kubenswrapper[4756]: I0318 14:04:01.709437 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" event={"ID":"258e7cae-bb8d-43cb-afc6-a133c8a38678","Type":"ContainerDied","Data":"f5d0742169a5e3bf3f6aed116b7b6e7a8088e10c7415302841c779087a57d61c"} Mar 18 14:04:01 crc kubenswrapper[4756]: I0318 14:04:01.712699 4756 generic.go:334] "Generic (PLEG): container finished" 
podID="84c694d6-e399-4e86-b988-074eb76dd7c6" containerID="79f141a1e9c74017c1590e25b98f1a1968d9b47a69f4f37cd56d6c4646b21d71" exitCode=0 Mar 18 14:04:01 crc kubenswrapper[4756]: I0318 14:04:01.712755 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" event={"ID":"84c694d6-e399-4e86-b988-074eb76dd7c6","Type":"ContainerDied","Data":"79f141a1e9c74017c1590e25b98f1a1968d9b47a69f4f37cd56d6c4646b21d71"} Mar 18 14:04:01 crc kubenswrapper[4756]: I0318 14:04:01.856229 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 14:04:01 crc kubenswrapper[4756]: I0318 14:04:01.964186 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb254a39-6093-4d92-aa6e-f3471c8bfcb9-kube-api-access\") pod \"bb254a39-6093-4d92-aa6e-f3471c8bfcb9\" (UID: \"bb254a39-6093-4d92-aa6e-f3471c8bfcb9\") " Mar 18 14:04:01 crc kubenswrapper[4756]: I0318 14:04:01.964358 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb254a39-6093-4d92-aa6e-f3471c8bfcb9-kubelet-dir\") pod \"bb254a39-6093-4d92-aa6e-f3471c8bfcb9\" (UID: \"bb254a39-6093-4d92-aa6e-f3471c8bfcb9\") " Mar 18 14:04:01 crc kubenswrapper[4756]: I0318 14:04:01.964456 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb254a39-6093-4d92-aa6e-f3471c8bfcb9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bb254a39-6093-4d92-aa6e-f3471c8bfcb9" (UID: "bb254a39-6093-4d92-aa6e-f3471c8bfcb9"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:04:01 crc kubenswrapper[4756]: I0318 14:04:01.964839 4756 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb254a39-6093-4d92-aa6e-f3471c8bfcb9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:01 crc kubenswrapper[4756]: I0318 14:04:01.973817 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb254a39-6093-4d92-aa6e-f3471c8bfcb9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bb254a39-6093-4d92-aa6e-f3471c8bfcb9" (UID: "bb254a39-6093-4d92-aa6e-f3471c8bfcb9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:04:02 crc kubenswrapper[4756]: I0318 14:04:02.066093 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb254a39-6093-4d92-aa6e-f3471c8bfcb9-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:02 crc kubenswrapper[4756]: I0318 14:04:02.720969 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bb254a39-6093-4d92-aa6e-f3471c8bfcb9","Type":"ContainerDied","Data":"a4586e4a3368adb4aa1485a875bc04a7fba2b1e52109abb4a34328a5338aa417"} Mar 18 14:04:02 crc kubenswrapper[4756]: I0318 14:04:02.721008 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4586e4a3368adb4aa1485a875bc04a7fba2b1e52109abb4a34328a5338aa417" Mar 18 14:04:02 crc kubenswrapper[4756]: I0318 14:04:02.721024 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 14:04:03 crc kubenswrapper[4756]: E0318 14:04:03.766822 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 18 14:04:03 crc kubenswrapper[4756]: E0318 14:04:03.767073 4756 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 14:04:03 crc kubenswrapper[4756]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 18 14:04:03 crc kubenswrapper[4756]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q42gv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29564042-6mtlx_openshift-infra(bfa461e5-a4e9-4cfa-a279-df6d4a56c973): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 18 14:04:03 crc kubenswrapper[4756]: > logger="UnhandledError" Mar 18 14:04:03 crc kubenswrapper[4756]: E0318 14:04:03.768206 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29564042-6mtlx" podUID="bfa461e5-a4e9-4cfa-a279-df6d4a56c973" Mar 18 14:04:03 crc kubenswrapper[4756]: I0318 14:04:03.932363 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:04:04 crc kubenswrapper[4756]: E0318 14:04:04.734793 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29564042-6mtlx" podUID="bfa461e5-a4e9-4cfa-a279-df6d4a56c973" Mar 18 14:04:04 crc kubenswrapper[4756]: I0318 14:04:04.902228 4756 patch_prober.go:28] interesting pod/route-controller-manager-5c978874dc-cllkp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:04:04 crc kubenswrapper[4756]: I0318 14:04:04.902287 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" podUID="84c694d6-e399-4e86-b988-074eb76dd7c6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:04:04 crc kubenswrapper[4756]: I0318 14:04:04.932646 4756 patch_prober.go:28] interesting pod/controller-manager-8f976fbbb-kktkr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 14:04:04 crc kubenswrapper[4756]: I0318 14:04:04.932695 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" podUID="258e7cae-bb8d-43cb-afc6-a133c8a38678" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 14:04:06 crc kubenswrapper[4756]: I0318 14:04:06.915568 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:04:06 crc kubenswrapper[4756]: I0318 14:04:06.915645 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.391344 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.397857 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.429011 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54df6d659c-bmwb2"] Mar 18 14:04:08 crc kubenswrapper[4756]: E0318 14:04:08.429438 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb254a39-6093-4d92-aa6e-f3471c8bfcb9" containerName="pruner" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.429459 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb254a39-6093-4d92-aa6e-f3471c8bfcb9" containerName="pruner" Mar 18 14:04:08 crc kubenswrapper[4756]: E0318 14:04:08.429476 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258e7cae-bb8d-43cb-afc6-a133c8a38678" containerName="controller-manager" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.429487 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="258e7cae-bb8d-43cb-afc6-a133c8a38678" containerName="controller-manager" Mar 18 14:04:08 crc kubenswrapper[4756]: E0318 14:04:08.429506 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c694d6-e399-4e86-b988-074eb76dd7c6" containerName="route-controller-manager" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.429515 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c694d6-e399-4e86-b988-074eb76dd7c6" containerName="route-controller-manager" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.429673 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="258e7cae-bb8d-43cb-afc6-a133c8a38678" containerName="controller-manager" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.429695 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="84c694d6-e399-4e86-b988-074eb76dd7c6" containerName="route-controller-manager" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.429712 4756 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bb254a39-6093-4d92-aa6e-f3471c8bfcb9" containerName="pruner" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.430344 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.440179 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54df6d659c-bmwb2"] Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.558771 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfklh\" (UniqueName: \"kubernetes.io/projected/258e7cae-bb8d-43cb-afc6-a133c8a38678-kube-api-access-jfklh\") pod \"258e7cae-bb8d-43cb-afc6-a133c8a38678\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.558849 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84c694d6-e399-4e86-b988-074eb76dd7c6-serving-cert\") pod \"84c694d6-e399-4e86-b988-074eb76dd7c6\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.558889 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8pgx\" (UniqueName: \"kubernetes.io/projected/84c694d6-e399-4e86-b988-074eb76dd7c6-kube-api-access-q8pgx\") pod \"84c694d6-e399-4e86-b988-074eb76dd7c6\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.558995 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84c694d6-e399-4e86-b988-074eb76dd7c6-config\") pod \"84c694d6-e399-4e86-b988-074eb76dd7c6\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 
14:04:08.559022 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/258e7cae-bb8d-43cb-afc6-a133c8a38678-serving-cert\") pod \"258e7cae-bb8d-43cb-afc6-a133c8a38678\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.559049 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-proxy-ca-bundles\") pod \"258e7cae-bb8d-43cb-afc6-a133c8a38678\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.559082 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-client-ca\") pod \"258e7cae-bb8d-43cb-afc6-a133c8a38678\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.559106 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84c694d6-e399-4e86-b988-074eb76dd7c6-client-ca\") pod \"84c694d6-e399-4e86-b988-074eb76dd7c6\" (UID: \"84c694d6-e399-4e86-b988-074eb76dd7c6\") " Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.559170 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-config\") pod \"258e7cae-bb8d-43cb-afc6-a133c8a38678\" (UID: \"258e7cae-bb8d-43cb-afc6-a133c8a38678\") " Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.559355 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87e836ce-e654-4069-9009-89b2d45d7d51-serving-cert\") pod \"controller-manager-54df6d659c-bmwb2\" 
(UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.559405 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-client-ca\") pod \"controller-manager-54df6d659c-bmwb2\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.559469 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnnmw\" (UniqueName: \"kubernetes.io/projected/87e836ce-e654-4069-9009-89b2d45d7d51-kube-api-access-gnnmw\") pod \"controller-manager-54df6d659c-bmwb2\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.559500 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-proxy-ca-bundles\") pod \"controller-manager-54df6d659c-bmwb2\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.559532 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-config\") pod \"controller-manager-54df6d659c-bmwb2\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.559946 4756 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-client-ca" (OuterVolumeSpecName: "client-ca") pod "258e7cae-bb8d-43cb-afc6-a133c8a38678" (UID: "258e7cae-bb8d-43cb-afc6-a133c8a38678"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.559939 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "258e7cae-bb8d-43cb-afc6-a133c8a38678" (UID: "258e7cae-bb8d-43cb-afc6-a133c8a38678"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.560142 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84c694d6-e399-4e86-b988-074eb76dd7c6-client-ca" (OuterVolumeSpecName: "client-ca") pod "84c694d6-e399-4e86-b988-074eb76dd7c6" (UID: "84c694d6-e399-4e86-b988-074eb76dd7c6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.560231 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84c694d6-e399-4e86-b988-074eb76dd7c6-config" (OuterVolumeSpecName: "config") pod "84c694d6-e399-4e86-b988-074eb76dd7c6" (UID: "84c694d6-e399-4e86-b988-074eb76dd7c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.560309 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-config" (OuterVolumeSpecName: "config") pod "258e7cae-bb8d-43cb-afc6-a133c8a38678" (UID: "258e7cae-bb8d-43cb-afc6-a133c8a38678"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.564991 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258e7cae-bb8d-43cb-afc6-a133c8a38678-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "258e7cae-bb8d-43cb-afc6-a133c8a38678" (UID: "258e7cae-bb8d-43cb-afc6-a133c8a38678"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.565002 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84c694d6-e399-4e86-b988-074eb76dd7c6-kube-api-access-q8pgx" (OuterVolumeSpecName: "kube-api-access-q8pgx") pod "84c694d6-e399-4e86-b988-074eb76dd7c6" (UID: "84c694d6-e399-4e86-b988-074eb76dd7c6"). InnerVolumeSpecName "kube-api-access-q8pgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.565319 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/258e7cae-bb8d-43cb-afc6-a133c8a38678-kube-api-access-jfklh" (OuterVolumeSpecName: "kube-api-access-jfklh") pod "258e7cae-bb8d-43cb-afc6-a133c8a38678" (UID: "258e7cae-bb8d-43cb-afc6-a133c8a38678"). InnerVolumeSpecName "kube-api-access-jfklh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.565479 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84c694d6-e399-4e86-b988-074eb76dd7c6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "84c694d6-e399-4e86-b988-074eb76dd7c6" (UID: "84c694d6-e399-4e86-b988-074eb76dd7c6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.660778 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-proxy-ca-bundles\") pod \"controller-manager-54df6d659c-bmwb2\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.661179 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-config\") pod \"controller-manager-54df6d659c-bmwb2\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.661704 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87e836ce-e654-4069-9009-89b2d45d7d51-serving-cert\") pod \"controller-manager-54df6d659c-bmwb2\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.661899 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-client-ca\") pod \"controller-manager-54df6d659c-bmwb2\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.662010 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-proxy-ca-bundles\") pod 
\"controller-manager-54df6d659c-bmwb2\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.662329 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnnmw\" (UniqueName: \"kubernetes.io/projected/87e836ce-e654-4069-9009-89b2d45d7d51-kube-api-access-gnnmw\") pod \"controller-manager-54df6d659c-bmwb2\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.662560 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84c694d6-e399-4e86-b988-074eb76dd7c6-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.662692 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/258e7cae-bb8d-43cb-afc6-a133c8a38678-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.662843 4756 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.663050 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.663282 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84c694d6-e399-4e86-b988-074eb76dd7c6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.663433 4756 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/258e7cae-bb8d-43cb-afc6-a133c8a38678-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.663577 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfklh\" (UniqueName: \"kubernetes.io/projected/258e7cae-bb8d-43cb-afc6-a133c8a38678-kube-api-access-jfklh\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.663715 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84c694d6-e399-4e86-b988-074eb76dd7c6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.663853 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8pgx\" (UniqueName: \"kubernetes.io/projected/84c694d6-e399-4e86-b988-074eb76dd7c6-kube-api-access-q8pgx\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.662777 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-client-ca\") pod \"controller-manager-54df6d659c-bmwb2\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.664023 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-config\") pod \"controller-manager-54df6d659c-bmwb2\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.667016 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/87e836ce-e654-4069-9009-89b2d45d7d51-serving-cert\") pod \"controller-manager-54df6d659c-bmwb2\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.685964 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnnmw\" (UniqueName: \"kubernetes.io/projected/87e836ce-e654-4069-9009-89b2d45d7d51-kube-api-access-gnnmw\") pod \"controller-manager-54df6d659c-bmwb2\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.756383 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" event={"ID":"258e7cae-bb8d-43cb-afc6-a133c8a38678","Type":"ContainerDied","Data":"55681ec73f5decd825a5ed7a9d69457d63e62fb031d954eab1ca02952b3adc1e"} Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.756393 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8f976fbbb-kktkr" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.756732 4756 scope.go:117] "RemoveContainer" containerID="f5d0742169a5e3bf3f6aed116b7b6e7a8088e10c7415302841c779087a57d61c" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.757228 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.758624 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" event={"ID":"84c694d6-e399-4e86-b988-074eb76dd7c6","Type":"ContainerDied","Data":"43c57c24887feada5ccb22af1703f908d45d663ca5c42c39a08ee777fe443ed7"} Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.758728 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp" Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.782242 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8f976fbbb-kktkr"] Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.785344 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8f976fbbb-kktkr"] Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.796245 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp"] Mar 18 14:04:08 crc kubenswrapper[4756]: I0318 14:04:08.798830 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c978874dc-cllkp"] Mar 18 14:04:09 crc kubenswrapper[4756]: I0318 14:04:09.331881 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="258e7cae-bb8d-43cb-afc6-a133c8a38678" path="/var/lib/kubelet/pods/258e7cae-bb8d-43cb-afc6-a133c8a38678/volumes" Mar 18 14:04:09 crc kubenswrapper[4756]: I0318 14:04:09.334252 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84c694d6-e399-4e86-b988-074eb76dd7c6" path="/var/lib/kubelet/pods/84c694d6-e399-4e86-b988-074eb76dd7c6/volumes" Mar 18 14:04:12 crc kubenswrapper[4756]: 
I0318 14:04:12.557452 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth"] Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.559565 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.563242 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.563708 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.563705 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.563922 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.564051 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.566390 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.576277 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth"] Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.716245 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22af9094-ffeb-4010-8f10-ddef3070a388-config\") pod 
\"route-controller-manager-56df5f54b6-8xdth\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.716321 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22af9094-ffeb-4010-8f10-ddef3070a388-serving-cert\") pod \"route-controller-manager-56df5f54b6-8xdth\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.716344 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22af9094-ffeb-4010-8f10-ddef3070a388-client-ca\") pod \"route-controller-manager-56df5f54b6-8xdth\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.716387 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ggtb\" (UniqueName: \"kubernetes.io/projected/22af9094-ffeb-4010-8f10-ddef3070a388-kube-api-access-4ggtb\") pod \"route-controller-manager-56df5f54b6-8xdth\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.817560 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22af9094-ffeb-4010-8f10-ddef3070a388-serving-cert\") pod \"route-controller-manager-56df5f54b6-8xdth\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 
14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.817616 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22af9094-ffeb-4010-8f10-ddef3070a388-client-ca\") pod \"route-controller-manager-56df5f54b6-8xdth\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.817683 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ggtb\" (UniqueName: \"kubernetes.io/projected/22af9094-ffeb-4010-8f10-ddef3070a388-kube-api-access-4ggtb\") pod \"route-controller-manager-56df5f54b6-8xdth\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.817772 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22af9094-ffeb-4010-8f10-ddef3070a388-config\") pod \"route-controller-manager-56df5f54b6-8xdth\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.818999 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22af9094-ffeb-4010-8f10-ddef3070a388-client-ca\") pod \"route-controller-manager-56df5f54b6-8xdth\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.819377 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22af9094-ffeb-4010-8f10-ddef3070a388-config\") pod 
\"route-controller-manager-56df5f54b6-8xdth\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.823845 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22af9094-ffeb-4010-8f10-ddef3070a388-serving-cert\") pod \"route-controller-manager-56df5f54b6-8xdth\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.838233 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ggtb\" (UniqueName: \"kubernetes.io/projected/22af9094-ffeb-4010-8f10-ddef3070a388-kube-api-access-4ggtb\") pod \"route-controller-manager-56df5f54b6-8xdth\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:12 crc kubenswrapper[4756]: I0318 14:04:12.880612 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:13 crc kubenswrapper[4756]: I0318 14:04:13.207546 4756 ???:1] "http: TLS handshake error from 192.168.126.11:58842: no serving certificate available for the kubelet" Mar 18 14:04:13 crc kubenswrapper[4756]: E0318 14:04:13.536349 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 14:04:13 crc kubenswrapper[4756]: E0318 14:04:13.536505 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5gcxc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompP
rofile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jmwzc_openshift-marketplace(c76033c1-1ccb-42ce-ade9-f46428bc0b46): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 14:04:13 crc kubenswrapper[4756]: E0318 14:04:13.537799 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-jmwzc" podUID="c76033c1-1ccb-42ce-ade9-f46428bc0b46" Mar 18 14:04:13 crc kubenswrapper[4756]: E0318 14:04:13.547688 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 14:04:13 crc kubenswrapper[4756]: E0318 14:04:13.547850 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-22d5s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-w28cp_openshift-marketplace(7bb3189f-716d-4fef-b885-3a031a60d981): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 14:04:13 crc kubenswrapper[4756]: E0318 14:04:13.549282 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-w28cp" podUID="7bb3189f-716d-4fef-b885-3a031a60d981" Mar 18 14:04:15 crc 
kubenswrapper[4756]: E0318 14:04:15.407017 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jmwzc" podUID="c76033c1-1ccb-42ce-ade9-f46428bc0b46" Mar 18 14:04:15 crc kubenswrapper[4756]: E0318 14:04:15.407379 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-w28cp" podUID="7bb3189f-716d-4fef-b885-3a031a60d981" Mar 18 14:04:15 crc kubenswrapper[4756]: E0318 14:04:15.475400 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 14:04:15 crc kubenswrapper[4756]: E0318 14:04:15.475561 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vzbbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-tfqzk_openshift-marketplace(fb1afba2-1ba1-43c6-9a0c-740f6504fa8b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 14:04:15 crc kubenswrapper[4756]: E0318 14:04:15.476795 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-tfqzk" podUID="fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" Mar 18 14:04:15 crc 
kubenswrapper[4756]: E0318 14:04:15.491522 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 14:04:15 crc kubenswrapper[4756]: E0318 14:04:15.491684 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zb2h9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-lkr5r_openshift-marketplace(48602255-9809-498e-9c4a-6053ba5ff591): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 14:04:15 crc kubenswrapper[4756]: E0318 14:04:15.493072 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-lkr5r" podUID="48602255-9809-498e-9c4a-6053ba5ff591" Mar 18 14:04:15 crc kubenswrapper[4756]: I0318 14:04:15.666436 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm8qz" Mar 18 14:04:16 crc kubenswrapper[4756]: E0318 14:04:16.627898 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-tfqzk" podUID="fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" Mar 18 14:04:16 crc kubenswrapper[4756]: E0318 14:04:16.627946 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-lkr5r" podUID="48602255-9809-498e-9c4a-6053ba5ff591" Mar 18 14:04:16 crc kubenswrapper[4756]: E0318 14:04:16.687289 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 14:04:16 crc kubenswrapper[4756]: E0318 
14:04:16.687418 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-56d8b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9hrhl_openshift-marketplace(caee1439-b7bb-456e-982f-1c3c3cdb51c3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 14:04:16 crc kubenswrapper[4756]: E0318 14:04:16.688601 4756 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9hrhl" podUID="caee1439-b7bb-456e-982f-1c3c3cdb51c3" Mar 18 14:04:17 crc kubenswrapper[4756]: I0318 14:04:17.897892 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 14:04:17 crc kubenswrapper[4756]: I0318 14:04:17.899161 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 14:04:17 crc kubenswrapper[4756]: I0318 14:04:17.902235 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 14:04:17 crc kubenswrapper[4756]: I0318 14:04:17.902529 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 14:04:17 crc kubenswrapper[4756]: I0318 14:04:17.904252 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 14:04:18 crc kubenswrapper[4756]: I0318 14:04:18.085386 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f048e4c3-268c-4c11-9052-86ac12dbd601-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f048e4c3-268c-4c11-9052-86ac12dbd601\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 14:04:18 crc kubenswrapper[4756]: I0318 14:04:18.085457 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f048e4c3-268c-4c11-9052-86ac12dbd601-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f048e4c3-268c-4c11-9052-86ac12dbd601\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 14:04:18 crc kubenswrapper[4756]: I0318 14:04:18.201237 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f048e4c3-268c-4c11-9052-86ac12dbd601-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f048e4c3-268c-4c11-9052-86ac12dbd601\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 14:04:18 crc kubenswrapper[4756]: I0318 14:04:18.201304 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f048e4c3-268c-4c11-9052-86ac12dbd601-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f048e4c3-268c-4c11-9052-86ac12dbd601\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 14:04:18 crc kubenswrapper[4756]: I0318 14:04:18.201390 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f048e4c3-268c-4c11-9052-86ac12dbd601-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f048e4c3-268c-4c11-9052-86ac12dbd601\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 14:04:18 crc kubenswrapper[4756]: I0318 14:04:18.219110 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f048e4c3-268c-4c11-9052-86ac12dbd601-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f048e4c3-268c-4c11-9052-86ac12dbd601\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 14:04:18 crc kubenswrapper[4756]: I0318 14:04:18.230018 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 14:04:19 crc kubenswrapper[4756]: E0318 14:04:19.709921 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9hrhl" podUID="caee1439-b7bb-456e-982f-1c3c3cdb51c3" Mar 18 14:04:19 crc kubenswrapper[4756]: E0318 14:04:19.743765 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 14:04:19 crc kubenswrapper[4756]: E0318 14:04:19.743967 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pt6kl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cn8ht_openshift-marketplace(d5a6ae3a-3fd6-4254-9ca5-8654eee53b55): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 14:04:19 crc kubenswrapper[4756]: E0318 14:04:19.745178 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cn8ht" podUID="d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" Mar 18 14:04:19 crc 
kubenswrapper[4756]: I0318 14:04:19.788382 4756 scope.go:117] "RemoveContainer" containerID="79f141a1e9c74017c1590e25b98f1a1968d9b47a69f4f37cd56d6c4646b21d71" Mar 18 14:04:19 crc kubenswrapper[4756]: E0318 14:04:19.793441 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 14:04:19 crc kubenswrapper[4756]: E0318 14:04:19.793578 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8jcxq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLo
gsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-gs6l5_openshift-marketplace(af462049-61c3-4da5-aeb0-0311404c4741): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 14:04:19 crc kubenswrapper[4756]: E0318 14:04:19.794942 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-gs6l5" podUID="af462049-61c3-4da5-aeb0-0311404c4741" Mar 18 14:04:19 crc kubenswrapper[4756]: E0318 14:04:19.823641 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-gs6l5" podUID="af462049-61c3-4da5-aeb0-0311404c4741" Mar 18 14:04:19 crc kubenswrapper[4756]: E0318 14:04:19.890416 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cn8ht" podUID="d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.015998 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gfdtl"] Mar 18 14:04:20 crc kubenswrapper[4756]: W0318 14:04:20.024488 4756 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc11d2088_741c_4812_8eb2_ccfc3d0c7d11.slice/crio-0cbb098da705587eaf7dca1dd66d47ef70a569de033997a4b6916584ec5d6a05 WatchSource:0}: Error finding container 0cbb098da705587eaf7dca1dd66d47ef70a569de033997a4b6916584ec5d6a05: Status 404 returned error can't find the container with id 0cbb098da705587eaf7dca1dd66d47ef70a569de033997a4b6916584ec5d6a05 Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.173269 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564044-fs4jl"] Mar 18 14:04:20 crc kubenswrapper[4756]: W0318 14:04:20.180485 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f367985_a362_46d8_8dab_205cb7756e9e.slice/crio-53fac7cc1392cde921e0bbb4481d8a4d340cc5c815885ec26a835dea6c18ae7e WatchSource:0}: Error finding container 53fac7cc1392cde921e0bbb4481d8a4d340cc5c815885ec26a835dea6c18ae7e: Status 404 returned error can't find the container with id 53fac7cc1392cde921e0bbb4481d8a4d340cc5c815885ec26a835dea6c18ae7e Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.274607 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54df6d659c-bmwb2"] Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.282368 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 14:04:20 crc kubenswrapper[4756]: W0318 14:04:20.292441 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87e836ce_e654_4069_9009_89b2d45d7d51.slice/crio-4f5b35ddac8a6577117cb7dc8496b2de904cbd820cf16740e84e3eca2f169d34 WatchSource:0}: Error finding container 4f5b35ddac8a6577117cb7dc8496b2de904cbd820cf16740e84e3eca2f169d34: Status 404 returned error can't find the container with id 
4f5b35ddac8a6577117cb7dc8496b2de904cbd820cf16740e84e3eca2f169d34 Mar 18 14:04:20 crc kubenswrapper[4756]: W0318 14:04:20.294646 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf048e4c3_268c_4c11_9052_86ac12dbd601.slice/crio-1dc5738afa2747686b763fa700acb321676182eaf2383b5e2c70b1fa00d43ed9 WatchSource:0}: Error finding container 1dc5738afa2747686b763fa700acb321676182eaf2383b5e2c70b1fa00d43ed9: Status 404 returned error can't find the container with id 1dc5738afa2747686b763fa700acb321676182eaf2383b5e2c70b1fa00d43ed9 Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.296091 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth"] Mar 18 14:04:20 crc kubenswrapper[4756]: W0318 14:04:20.306743 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22af9094_ffeb_4010_8f10_ddef3070a388.slice/crio-3d4d82dd0f607b67ea0c272a3f6fa833cee7249a1d2e764a018caeccd1749f46 WatchSource:0}: Error finding container 3d4d82dd0f607b67ea0c272a3f6fa833cee7249a1d2e764a018caeccd1749f46: Status 404 returned error can't find the container with id 3d4d82dd0f607b67ea0c272a3f6fa833cee7249a1d2e764a018caeccd1749f46 Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.530320 4756 csr.go:261] certificate signing request csr-kfdsq is approved, waiting to be issued Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.537171 4756 csr.go:257] certificate signing request csr-kfdsq is issued Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.572849 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54df6d659c-bmwb2"] Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.683723 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth"] Mar 18 14:04:20 
crc kubenswrapper[4756]: I0318 14:04:20.822433 4756 generic.go:334] "Generic (PLEG): container finished" podID="bfa461e5-a4e9-4cfa-a279-df6d4a56c973" containerID="edc3229d8de13bc3179b6b88bae3146b8c256dadefa49b635551df9aeb16fa95" exitCode=0 Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.822528 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564042-6mtlx" event={"ID":"bfa461e5-a4e9-4cfa-a279-df6d4a56c973","Type":"ContainerDied","Data":"edc3229d8de13bc3179b6b88bae3146b8c256dadefa49b635551df9aeb16fa95"} Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.824064 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" event={"ID":"87e836ce-e654-4069-9009-89b2d45d7d51","Type":"ContainerStarted","Data":"d124bdff9e1d9a1adf357852896b5fdb48ec6f1a19ab88ab55a23dd72d15f991"} Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.824130 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" event={"ID":"87e836ce-e654-4069-9009-89b2d45d7d51","Type":"ContainerStarted","Data":"4f5b35ddac8a6577117cb7dc8496b2de904cbd820cf16740e84e3eca2f169d34"} Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.824421 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.825168 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564044-fs4jl" event={"ID":"0f367985-a362-46d8-8dab-205cb7756e9e","Type":"ContainerStarted","Data":"53fac7cc1392cde921e0bbb4481d8a4d340cc5c815885ec26a835dea6c18ae7e"} Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.826940 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" 
event={"ID":"22af9094-ffeb-4010-8f10-ddef3070a388","Type":"ContainerStarted","Data":"82bb12de62d979e9e68f0b90e53cdbac94a98d29d7c2606cd9d713ce8b82b453"} Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.826968 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" event={"ID":"22af9094-ffeb-4010-8f10-ddef3070a388","Type":"ContainerStarted","Data":"3d4d82dd0f607b67ea0c272a3f6fa833cee7249a1d2e764a018caeccd1749f46"} Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.827193 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.829744 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" event={"ID":"c11d2088-741c-4812-8eb2-ccfc3d0c7d11","Type":"ContainerStarted","Data":"cb71378c43120f2fb303f19348a5b8ca8b5daf220ab855ab4fff024343b9705b"} Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.829789 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" event={"ID":"c11d2088-741c-4812-8eb2-ccfc3d0c7d11","Type":"ContainerStarted","Data":"b531574c267a01c672dee3bd7411502e7fe3f0321db6b4bfe27091ca94664ab6"} Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.829804 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gfdtl" event={"ID":"c11d2088-741c-4812-8eb2-ccfc3d0c7d11","Type":"ContainerStarted","Data":"0cbb098da705587eaf7dca1dd66d47ef70a569de033997a4b6916584ec5d6a05"} Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.846098 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"f048e4c3-268c-4c11-9052-86ac12dbd601","Type":"ContainerStarted","Data":"41f2ae6f07c148c11f59094072a2cd9cc0b482d27bfba6dd9d2038ae20770d94"} Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.846173 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f048e4c3-268c-4c11-9052-86ac12dbd601","Type":"ContainerStarted","Data":"1dc5738afa2747686b763fa700acb321676182eaf2383b5e2c70b1fa00d43ed9"} Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.848499 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.849019 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.855463 4756 generic.go:334] "Generic (PLEG): container finished" podID="604a2bca-c232-4242-a46c-31630b85585d" containerID="3ae0906fece88bf293fd0af9440c5685856af18e188f41744bd430dc55c90e99" exitCode=0 Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.855522 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhpzd" event={"ID":"604a2bca-c232-4242-a46c-31630b85585d","Type":"ContainerDied","Data":"3ae0906fece88bf293fd0af9440c5685856af18e188f41744bd430dc55c90e99"} Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.865032 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" podStartSLOduration=20.865009872999998 podStartE2EDuration="20.865009873s" podCreationTimestamp="2026-03-18 14:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:04:20.86116846 +0000 UTC 
m=+262.175586445" watchObservedRunningTime="2026-03-18 14:04:20.865009873 +0000 UTC m=+262.179427848" Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.877569 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" podStartSLOduration=20.877548829 podStartE2EDuration="20.877548829s" podCreationTimestamp="2026-03-18 14:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:04:20.875267468 +0000 UTC m=+262.189685433" watchObservedRunningTime="2026-03-18 14:04:20.877548829 +0000 UTC m=+262.191966804" Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.892859 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gfdtl" podStartSLOduration=204.892843209 podStartE2EDuration="3m24.892843209s" podCreationTimestamp="2026-03-18 14:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:04:20.890940348 +0000 UTC m=+262.205358323" watchObservedRunningTime="2026-03-18 14:04:20.892843209 +0000 UTC m=+262.207261184" Mar 18 14:04:20 crc kubenswrapper[4756]: I0318 14:04:20.966792 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=3.966775341 podStartE2EDuration="3.966775341s" podCreationTimestamp="2026-03-18 14:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:04:20.963336919 +0000 UTC m=+262.277754894" watchObservedRunningTime="2026-03-18 14:04:20.966775341 +0000 UTC m=+262.281193326" Mar 18 14:04:21 crc kubenswrapper[4756]: I0318 14:04:21.539164 4756 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate 
expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-05 16:39:10.867234069 +0000 UTC Mar 18 14:04:21 crc kubenswrapper[4756]: I0318 14:04:21.539646 4756 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6290h34m49.32759394s for next certificate rotation Mar 18 14:04:21 crc kubenswrapper[4756]: I0318 14:04:21.863377 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhpzd" event={"ID":"604a2bca-c232-4242-a46c-31630b85585d","Type":"ContainerStarted","Data":"7fae7af2af82ed351fcb21acf573f2ec73c1c5d4f72176108f1969b485859a2f"} Mar 18 14:04:21 crc kubenswrapper[4756]: I0318 14:04:21.866265 4756 generic.go:334] "Generic (PLEG): container finished" podID="0f367985-a362-46d8-8dab-205cb7756e9e" containerID="db41a032f116f2ee9e8f001ee85ed6982558149808fd2bac78fad8880a53578e" exitCode=0 Mar 18 14:04:21 crc kubenswrapper[4756]: I0318 14:04:21.866308 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564044-fs4jl" event={"ID":"0f367985-a362-46d8-8dab-205cb7756e9e","Type":"ContainerDied","Data":"db41a032f116f2ee9e8f001ee85ed6982558149808fd2bac78fad8880a53578e"} Mar 18 14:04:21 crc kubenswrapper[4756]: I0318 14:04:21.868726 4756 generic.go:334] "Generic (PLEG): container finished" podID="f048e4c3-268c-4c11-9052-86ac12dbd601" containerID="41f2ae6f07c148c11f59094072a2cd9cc0b482d27bfba6dd9d2038ae20770d94" exitCode=0 Mar 18 14:04:21 crc kubenswrapper[4756]: I0318 14:04:21.868864 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f048e4c3-268c-4c11-9052-86ac12dbd601","Type":"ContainerDied","Data":"41f2ae6f07c148c11f59094072a2cd9cc0b482d27bfba6dd9d2038ae20770d94"} Mar 18 14:04:21 crc kubenswrapper[4756]: I0318 14:04:21.868997 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" 
podUID="22af9094-ffeb-4010-8f10-ddef3070a388" containerName="route-controller-manager" containerID="cri-o://82bb12de62d979e9e68f0b90e53cdbac94a98d29d7c2606cd9d713ce8b82b453" gracePeriod=30 Mar 18 14:04:21 crc kubenswrapper[4756]: I0318 14:04:21.869261 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" podUID="87e836ce-e654-4069-9009-89b2d45d7d51" containerName="controller-manager" containerID="cri-o://d124bdff9e1d9a1adf357852896b5fdb48ec6f1a19ab88ab55a23dd72d15f991" gracePeriod=30 Mar 18 14:04:21 crc kubenswrapper[4756]: I0318 14:04:21.896380 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dhpzd" podStartSLOduration=3.113835537 podStartE2EDuration="38.896356032s" podCreationTimestamp="2026-03-18 14:03:43 +0000 UTC" firstStartedPulling="2026-03-18 14:03:45.462799114 +0000 UTC m=+226.777217089" lastFinishedPulling="2026-03-18 14:04:21.245319609 +0000 UTC m=+262.559737584" observedRunningTime="2026-03-18 14:04:21.894457721 +0000 UTC m=+263.208875736" watchObservedRunningTime="2026-03-18 14:04:21.896356032 +0000 UTC m=+263.210774017" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.136763 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564042-6mtlx" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.157351 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q42gv\" (UniqueName: \"kubernetes.io/projected/bfa461e5-a4e9-4cfa-a279-df6d4a56c973-kube-api-access-q42gv\") pod \"bfa461e5-a4e9-4cfa-a279-df6d4a56c973\" (UID: \"bfa461e5-a4e9-4cfa-a279-df6d4a56c973\") " Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.165600 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfa461e5-a4e9-4cfa-a279-df6d4a56c973-kube-api-access-q42gv" (OuterVolumeSpecName: "kube-api-access-q42gv") pod "bfa461e5-a4e9-4cfa-a279-df6d4a56c973" (UID: "bfa461e5-a4e9-4cfa-a279-df6d4a56c973"). InnerVolumeSpecName "kube-api-access-q42gv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.258961 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q42gv\" (UniqueName: \"kubernetes.io/projected/bfa461e5-a4e9-4cfa-a279-df6d4a56c973-kube-api-access-q42gv\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.265300 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.285304 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.460761 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-proxy-ca-bundles\") pod \"87e836ce-e654-4069-9009-89b2d45d7d51\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.460847 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22af9094-ffeb-4010-8f10-ddef3070a388-config\") pod \"22af9094-ffeb-4010-8f10-ddef3070a388\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.460889 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22af9094-ffeb-4010-8f10-ddef3070a388-serving-cert\") pod \"22af9094-ffeb-4010-8f10-ddef3070a388\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.460922 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22af9094-ffeb-4010-8f10-ddef3070a388-client-ca\") pod \"22af9094-ffeb-4010-8f10-ddef3070a388\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.460953 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ggtb\" (UniqueName: \"kubernetes.io/projected/22af9094-ffeb-4010-8f10-ddef3070a388-kube-api-access-4ggtb\") pod \"22af9094-ffeb-4010-8f10-ddef3070a388\" (UID: \"22af9094-ffeb-4010-8f10-ddef3070a388\") " Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.460981 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-client-ca\") pod \"87e836ce-e654-4069-9009-89b2d45d7d51\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.461011 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-config\") pod \"87e836ce-e654-4069-9009-89b2d45d7d51\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.461031 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87e836ce-e654-4069-9009-89b2d45d7d51-serving-cert\") pod \"87e836ce-e654-4069-9009-89b2d45d7d51\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.461061 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnnmw\" (UniqueName: \"kubernetes.io/projected/87e836ce-e654-4069-9009-89b2d45d7d51-kube-api-access-gnnmw\") pod \"87e836ce-e654-4069-9009-89b2d45d7d51\" (UID: \"87e836ce-e654-4069-9009-89b2d45d7d51\") " Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.461681 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "87e836ce-e654-4069-9009-89b2d45d7d51" (UID: "87e836ce-e654-4069-9009-89b2d45d7d51"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.461810 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22af9094-ffeb-4010-8f10-ddef3070a388-config" (OuterVolumeSpecName: "config") pod "22af9094-ffeb-4010-8f10-ddef3070a388" (UID: "22af9094-ffeb-4010-8f10-ddef3070a388"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.461908 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22af9094-ffeb-4010-8f10-ddef3070a388-client-ca" (OuterVolumeSpecName: "client-ca") pod "22af9094-ffeb-4010-8f10-ddef3070a388" (UID: "22af9094-ffeb-4010-8f10-ddef3070a388"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.462036 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-config" (OuterVolumeSpecName: "config") pod "87e836ce-e654-4069-9009-89b2d45d7d51" (UID: "87e836ce-e654-4069-9009-89b2d45d7d51"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.462265 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-client-ca" (OuterVolumeSpecName: "client-ca") pod "87e836ce-e654-4069-9009-89b2d45d7d51" (UID: "87e836ce-e654-4069-9009-89b2d45d7d51"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.464223 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22af9094-ffeb-4010-8f10-ddef3070a388-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "22af9094-ffeb-4010-8f10-ddef3070a388" (UID: "22af9094-ffeb-4010-8f10-ddef3070a388"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.464251 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87e836ce-e654-4069-9009-89b2d45d7d51-kube-api-access-gnnmw" (OuterVolumeSpecName: "kube-api-access-gnnmw") pod "87e836ce-e654-4069-9009-89b2d45d7d51" (UID: "87e836ce-e654-4069-9009-89b2d45d7d51"). InnerVolumeSpecName "kube-api-access-gnnmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.464314 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e836ce-e654-4069-9009-89b2d45d7d51-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "87e836ce-e654-4069-9009-89b2d45d7d51" (UID: "87e836ce-e654-4069-9009-89b2d45d7d51"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.464405 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22af9094-ffeb-4010-8f10-ddef3070a388-kube-api-access-4ggtb" (OuterVolumeSpecName: "kube-api-access-4ggtb") pod "22af9094-ffeb-4010-8f10-ddef3070a388" (UID: "22af9094-ffeb-4010-8f10-ddef3070a388"). InnerVolumeSpecName "kube-api-access-4ggtb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.540174 4756 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-11 03:18:24.453530461 +0000 UTC Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.540216 4756 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7165h14m1.913317238s for next certificate rotation Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.562642 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.562686 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.562699 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87e836ce-e654-4069-9009-89b2d45d7d51-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.562710 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnnmw\" (UniqueName: \"kubernetes.io/projected/87e836ce-e654-4069-9009-89b2d45d7d51-kube-api-access-gnnmw\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.562723 4756 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87e836ce-e654-4069-9009-89b2d45d7d51-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.562734 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22af9094-ffeb-4010-8f10-ddef3070a388-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.562745 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22af9094-ffeb-4010-8f10-ddef3070a388-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.562756 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22af9094-ffeb-4010-8f10-ddef3070a388-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.562767 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ggtb\" (UniqueName: \"kubernetes.io/projected/22af9094-ffeb-4010-8f10-ddef3070a388-kube-api-access-4ggtb\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.876494 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564042-6mtlx" event={"ID":"bfa461e5-a4e9-4cfa-a279-df6d4a56c973","Type":"ContainerDied","Data":"f8c5af48c5c990f9985ba3860bf98684cab494c7f98cef18b658e6d02972a16c"} Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.876543 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8c5af48c5c990f9985ba3860bf98684cab494c7f98cef18b658e6d02972a16c" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.876515 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564042-6mtlx" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.878311 4756 generic.go:334] "Generic (PLEG): container finished" podID="87e836ce-e654-4069-9009-89b2d45d7d51" containerID="d124bdff9e1d9a1adf357852896b5fdb48ec6f1a19ab88ab55a23dd72d15f991" exitCode=0 Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.878355 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.878414 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" event={"ID":"87e836ce-e654-4069-9009-89b2d45d7d51","Type":"ContainerDied","Data":"d124bdff9e1d9a1adf357852896b5fdb48ec6f1a19ab88ab55a23dd72d15f991"} Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.878492 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54df6d659c-bmwb2" event={"ID":"87e836ce-e654-4069-9009-89b2d45d7d51","Type":"ContainerDied","Data":"4f5b35ddac8a6577117cb7dc8496b2de904cbd820cf16740e84e3eca2f169d34"} Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.878516 4756 scope.go:117] "RemoveContainer" containerID="d124bdff9e1d9a1adf357852896b5fdb48ec6f1a19ab88ab55a23dd72d15f991" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.879719 4756 generic.go:334] "Generic (PLEG): container finished" podID="22af9094-ffeb-4010-8f10-ddef3070a388" containerID="82bb12de62d979e9e68f0b90e53cdbac94a98d29d7c2606cd9d713ce8b82b453" exitCode=0 Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.879771 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.879807 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" event={"ID":"22af9094-ffeb-4010-8f10-ddef3070a388","Type":"ContainerDied","Data":"82bb12de62d979e9e68f0b90e53cdbac94a98d29d7c2606cd9d713ce8b82b453"} Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.879837 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth" event={"ID":"22af9094-ffeb-4010-8f10-ddef3070a388","Type":"ContainerDied","Data":"3d4d82dd0f607b67ea0c272a3f6fa833cee7249a1d2e764a018caeccd1749f46"} Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.913763 4756 scope.go:117] "RemoveContainer" containerID="d124bdff9e1d9a1adf357852896b5fdb48ec6f1a19ab88ab55a23dd72d15f991" Mar 18 14:04:22 crc kubenswrapper[4756]: E0318 14:04:22.917667 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d124bdff9e1d9a1adf357852896b5fdb48ec6f1a19ab88ab55a23dd72d15f991\": container with ID starting with d124bdff9e1d9a1adf357852896b5fdb48ec6f1a19ab88ab55a23dd72d15f991 not found: ID does not exist" containerID="d124bdff9e1d9a1adf357852896b5fdb48ec6f1a19ab88ab55a23dd72d15f991" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.917745 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d124bdff9e1d9a1adf357852896b5fdb48ec6f1a19ab88ab55a23dd72d15f991"} err="failed to get container status \"d124bdff9e1d9a1adf357852896b5fdb48ec6f1a19ab88ab55a23dd72d15f991\": rpc error: code = NotFound desc = could not find container \"d124bdff9e1d9a1adf357852896b5fdb48ec6f1a19ab88ab55a23dd72d15f991\": container with ID starting with 
d124bdff9e1d9a1adf357852896b5fdb48ec6f1a19ab88ab55a23dd72d15f991 not found: ID does not exist" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.917784 4756 scope.go:117] "RemoveContainer" containerID="82bb12de62d979e9e68f0b90e53cdbac94a98d29d7c2606cd9d713ce8b82b453" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.931669 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54df6d659c-bmwb2"] Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.939269 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-54df6d659c-bmwb2"] Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.939463 4756 scope.go:117] "RemoveContainer" containerID="82bb12de62d979e9e68f0b90e53cdbac94a98d29d7c2606cd9d713ce8b82b453" Mar 18 14:04:22 crc kubenswrapper[4756]: E0318 14:04:22.940033 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82bb12de62d979e9e68f0b90e53cdbac94a98d29d7c2606cd9d713ce8b82b453\": container with ID starting with 82bb12de62d979e9e68f0b90e53cdbac94a98d29d7c2606cd9d713ce8b82b453 not found: ID does not exist" containerID="82bb12de62d979e9e68f0b90e53cdbac94a98d29d7c2606cd9d713ce8b82b453" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.940065 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82bb12de62d979e9e68f0b90e53cdbac94a98d29d7c2606cd9d713ce8b82b453"} err="failed to get container status \"82bb12de62d979e9e68f0b90e53cdbac94a98d29d7c2606cd9d713ce8b82b453\": rpc error: code = NotFound desc = could not find container \"82bb12de62d979e9e68f0b90e53cdbac94a98d29d7c2606cd9d713ce8b82b453\": container with ID starting with 82bb12de62d979e9e68f0b90e53cdbac94a98d29d7c2606cd9d713ce8b82b453 not found: ID does not exist" Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.943333 4756 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth"] Mar 18 14:04:22 crc kubenswrapper[4756]: I0318 14:04:22.946436 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56df5f54b6-8xdth"] Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.128335 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.132708 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564044-fs4jl" Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.272491 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdmt6\" (UniqueName: \"kubernetes.io/projected/0f367985-a362-46d8-8dab-205cb7756e9e-kube-api-access-fdmt6\") pod \"0f367985-a362-46d8-8dab-205cb7756e9e\" (UID: \"0f367985-a362-46d8-8dab-205cb7756e9e\") " Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.272561 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f048e4c3-268c-4c11-9052-86ac12dbd601-kube-api-access\") pod \"f048e4c3-268c-4c11-9052-86ac12dbd601\" (UID: \"f048e4c3-268c-4c11-9052-86ac12dbd601\") " Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.272586 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f048e4c3-268c-4c11-9052-86ac12dbd601-kubelet-dir\") pod \"f048e4c3-268c-4c11-9052-86ac12dbd601\" (UID: \"f048e4c3-268c-4c11-9052-86ac12dbd601\") " Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.272736 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f048e4c3-268c-4c11-9052-86ac12dbd601-kubelet-dir" 
(OuterVolumeSpecName: "kubelet-dir") pod "f048e4c3-268c-4c11-9052-86ac12dbd601" (UID: "f048e4c3-268c-4c11-9052-86ac12dbd601"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.272871 4756 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f048e4c3-268c-4c11-9052-86ac12dbd601-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.277487 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f367985-a362-46d8-8dab-205cb7756e9e-kube-api-access-fdmt6" (OuterVolumeSpecName: "kube-api-access-fdmt6") pod "0f367985-a362-46d8-8dab-205cb7756e9e" (UID: "0f367985-a362-46d8-8dab-205cb7756e9e"). InnerVolumeSpecName "kube-api-access-fdmt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.282869 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f048e4c3-268c-4c11-9052-86ac12dbd601-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f048e4c3-268c-4c11-9052-86ac12dbd601" (UID: "f048e4c3-268c-4c11-9052-86ac12dbd601"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.324425 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22af9094-ffeb-4010-8f10-ddef3070a388" path="/var/lib/kubelet/pods/22af9094-ffeb-4010-8f10-ddef3070a388/volumes" Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.325276 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87e836ce-e654-4069-9009-89b2d45d7d51" path="/var/lib/kubelet/pods/87e836ce-e654-4069-9009-89b2d45d7d51/volumes" Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.374234 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdmt6\" (UniqueName: \"kubernetes.io/projected/0f367985-a362-46d8-8dab-205cb7756e9e-kube-api-access-fdmt6\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.374270 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f048e4c3-268c-4c11-9052-86ac12dbd601-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.892349 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564044-fs4jl" Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.892299 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564044-fs4jl" event={"ID":"0f367985-a362-46d8-8dab-205cb7756e9e","Type":"ContainerDied","Data":"53fac7cc1392cde921e0bbb4481d8a4d340cc5c815885ec26a835dea6c18ae7e"} Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.892438 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53fac7cc1392cde921e0bbb4481d8a4d340cc5c815885ec26a835dea6c18ae7e" Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.895610 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f048e4c3-268c-4c11-9052-86ac12dbd601","Type":"ContainerDied","Data":"1dc5738afa2747686b763fa700acb321676182eaf2383b5e2c70b1fa00d43ed9"} Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.895654 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dc5738afa2747686b763fa700acb321676182eaf2383b5e2c70b1fa00d43ed9" Mar 18 14:04:23 crc kubenswrapper[4756]: I0318 14:04:23.895041 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.053929 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.054003 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.215883 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.566741 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7556c9649b-qttmn"] Mar 18 14:04:24 crc kubenswrapper[4756]: E0318 14:04:24.567047 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22af9094-ffeb-4010-8f10-ddef3070a388" containerName="route-controller-manager" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.567070 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="22af9094-ffeb-4010-8f10-ddef3070a388" containerName="route-controller-manager" Mar 18 14:04:24 crc kubenswrapper[4756]: E0318 14:04:24.567086 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f367985-a362-46d8-8dab-205cb7756e9e" containerName="oc" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.567097 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f367985-a362-46d8-8dab-205cb7756e9e" containerName="oc" Mar 18 14:04:24 crc kubenswrapper[4756]: E0318 14:04:24.567113 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e836ce-e654-4069-9009-89b2d45d7d51" containerName="controller-manager" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.567151 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e836ce-e654-4069-9009-89b2d45d7d51" 
containerName="controller-manager" Mar 18 14:04:24 crc kubenswrapper[4756]: E0318 14:04:24.567161 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa461e5-a4e9-4cfa-a279-df6d4a56c973" containerName="oc" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.567169 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa461e5-a4e9-4cfa-a279-df6d4a56c973" containerName="oc" Mar 18 14:04:24 crc kubenswrapper[4756]: E0318 14:04:24.567186 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f048e4c3-268c-4c11-9052-86ac12dbd601" containerName="pruner" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.567195 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f048e4c3-268c-4c11-9052-86ac12dbd601" containerName="pruner" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.567308 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="87e836ce-e654-4069-9009-89b2d45d7d51" containerName="controller-manager" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.567321 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f048e4c3-268c-4c11-9052-86ac12dbd601" containerName="pruner" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.567336 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa461e5-a4e9-4cfa-a279-df6d4a56c973" containerName="oc" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.567346 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="22af9094-ffeb-4010-8f10-ddef3070a388" containerName="route-controller-manager" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.567358 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f367985-a362-46d8-8dab-205cb7756e9e" containerName="oc" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.567843 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.570142 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.570459 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w"] Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.571515 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.572369 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.573765 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.573939 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.574170 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.574323 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.574532 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.574867 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 14:04:24 crc 
kubenswrapper[4756]: I0318 14:04:24.575329 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.575679 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.575772 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.575904 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w"] Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.577472 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.579462 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.580218 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7556c9649b-qttmn"] Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.594463 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-proxy-ca-bundles\") pod \"controller-manager-7556c9649b-qttmn\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.594508 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgmsn\" (UniqueName: 
\"kubernetes.io/projected/f909523c-3c43-495a-ad45-81fcc03445fb-kube-api-access-wgmsn\") pod \"route-controller-manager-6d7787d89-s9n5w\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.594534 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/437026fe-8376-43fa-899e-4cd0f25468c0-serving-cert\") pod \"controller-manager-7556c9649b-qttmn\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.594561 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f909523c-3c43-495a-ad45-81fcc03445fb-config\") pod \"route-controller-manager-6d7787d89-s9n5w\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.594579 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-client-ca\") pod \"controller-manager-7556c9649b-qttmn\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.594665 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f909523c-3c43-495a-ad45-81fcc03445fb-serving-cert\") pod \"route-controller-manager-6d7787d89-s9n5w\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " 
pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.594695 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-config\") pod \"controller-manager-7556c9649b-qttmn\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.594720 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d828f\" (UniqueName: \"kubernetes.io/projected/437026fe-8376-43fa-899e-4cd0f25468c0-kube-api-access-d828f\") pod \"controller-manager-7556c9649b-qttmn\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.594790 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f909523c-3c43-495a-ad45-81fcc03445fb-client-ca\") pod \"route-controller-manager-6d7787d89-s9n5w\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.696381 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f909523c-3c43-495a-ad45-81fcc03445fb-client-ca\") pod \"route-controller-manager-6d7787d89-s9n5w\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.696482 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-proxy-ca-bundles\") pod \"controller-manager-7556c9649b-qttmn\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.696541 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgmsn\" (UniqueName: \"kubernetes.io/projected/f909523c-3c43-495a-ad45-81fcc03445fb-kube-api-access-wgmsn\") pod \"route-controller-manager-6d7787d89-s9n5w\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.696584 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/437026fe-8376-43fa-899e-4cd0f25468c0-serving-cert\") pod \"controller-manager-7556c9649b-qttmn\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.696624 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f909523c-3c43-495a-ad45-81fcc03445fb-config\") pod \"route-controller-manager-6d7787d89-s9n5w\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.696648 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-client-ca\") pod \"controller-manager-7556c9649b-qttmn\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 
crc kubenswrapper[4756]: I0318 14:04:24.696674 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f909523c-3c43-495a-ad45-81fcc03445fb-serving-cert\") pod \"route-controller-manager-6d7787d89-s9n5w\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.696718 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-config\") pod \"controller-manager-7556c9649b-qttmn\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.696741 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d828f\" (UniqueName: \"kubernetes.io/projected/437026fe-8376-43fa-899e-4cd0f25468c0-kube-api-access-d828f\") pod \"controller-manager-7556c9649b-qttmn\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.697600 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-client-ca\") pod \"controller-manager-7556c9649b-qttmn\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.697730 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f909523c-3c43-495a-ad45-81fcc03445fb-config\") pod \"route-controller-manager-6d7787d89-s9n5w\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " 
pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.698177 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-proxy-ca-bundles\") pod \"controller-manager-7556c9649b-qttmn\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.698212 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-config\") pod \"controller-manager-7556c9649b-qttmn\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.698538 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f909523c-3c43-495a-ad45-81fcc03445fb-client-ca\") pod \"route-controller-manager-6d7787d89-s9n5w\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.701463 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/437026fe-8376-43fa-899e-4cd0f25468c0-serving-cert\") pod \"controller-manager-7556c9649b-qttmn\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.701897 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f909523c-3c43-495a-ad45-81fcc03445fb-serving-cert\") pod 
\"route-controller-manager-6d7787d89-s9n5w\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.712100 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgmsn\" (UniqueName: \"kubernetes.io/projected/f909523c-3c43-495a-ad45-81fcc03445fb-kube-api-access-wgmsn\") pod \"route-controller-manager-6d7787d89-s9n5w\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.712404 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d828f\" (UniqueName: \"kubernetes.io/projected/437026fe-8376-43fa-899e-4cd0f25468c0-kube-api-access-d828f\") pod \"controller-manager-7556c9649b-qttmn\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.891556 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:24 crc kubenswrapper[4756]: I0318 14:04:24.903050 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.076753 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w"] Mar 18 14:04:25 crc kubenswrapper[4756]: W0318 14:04:25.084913 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf909523c_3c43_495a_ad45_81fcc03445fb.slice/crio-b61a7a648dc97c35dfa435c97667c1fd975b55a4ec4bfef28bbf916ba009e38f WatchSource:0}: Error finding container b61a7a648dc97c35dfa435c97667c1fd975b55a4ec4bfef28bbf916ba009e38f: Status 404 returned error can't find the container with id b61a7a648dc97c35dfa435c97667c1fd975b55a4ec4bfef28bbf916ba009e38f Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.104356 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.105500 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.107609 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.108018 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.108016 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.129351 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7556c9649b-qttmn"] Mar 18 14:04:25 crc kubenswrapper[4756]: W0318 14:04:25.140467 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437026fe_8376_43fa_899e_4cd0f25468c0.slice/crio-0492a4735d3c9c492ff51fa67aac1fbfbfc931823aa2dfe3c5f20cb6c7df8e57 WatchSource:0}: Error finding container 0492a4735d3c9c492ff51fa67aac1fbfbfc931823aa2dfe3c5f20cb6c7df8e57: Status 404 returned error can't find the container with id 0492a4735d3c9c492ff51fa67aac1fbfbfc931823aa2dfe3c5f20cb6c7df8e57 Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.203742 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/356d2f47-a922-458e-8578-a79ee650e100-kube-api-access\") pod \"installer-9-crc\" (UID: \"356d2f47-a922-458e-8578-a79ee650e100\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.203810 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/356d2f47-a922-458e-8578-a79ee650e100-kubelet-dir\") pod \"installer-9-crc\" (UID: \"356d2f47-a922-458e-8578-a79ee650e100\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.203891 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/356d2f47-a922-458e-8578-a79ee650e100-var-lock\") pod \"installer-9-crc\" (UID: \"356d2f47-a922-458e-8578-a79ee650e100\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.305456 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/356d2f47-a922-458e-8578-a79ee650e100-var-lock\") pod \"installer-9-crc\" (UID: \"356d2f47-a922-458e-8578-a79ee650e100\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.305564 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/356d2f47-a922-458e-8578-a79ee650e100-kube-api-access\") pod \"installer-9-crc\" (UID: \"356d2f47-a922-458e-8578-a79ee650e100\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.305602 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/356d2f47-a922-458e-8578-a79ee650e100-kubelet-dir\") pod \"installer-9-crc\" (UID: \"356d2f47-a922-458e-8578-a79ee650e100\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.305616 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/356d2f47-a922-458e-8578-a79ee650e100-var-lock\") pod \"installer-9-crc\" (UID: 
\"356d2f47-a922-458e-8578-a79ee650e100\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.305834 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/356d2f47-a922-458e-8578-a79ee650e100-kubelet-dir\") pod \"installer-9-crc\" (UID: \"356d2f47-a922-458e-8578-a79ee650e100\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.329738 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/356d2f47-a922-458e-8578-a79ee650e100-kube-api-access\") pod \"installer-9-crc\" (UID: \"356d2f47-a922-458e-8578-a79ee650e100\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.423049 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.849023 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.913251 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" event={"ID":"f909523c-3c43-495a-ad45-81fcc03445fb","Type":"ContainerStarted","Data":"e9b4fb3e6ca901ff81f75d712c5d438206918648691a2dbd67c17982864b1d0c"} Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.913924 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.914014 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" 
event={"ID":"f909523c-3c43-495a-ad45-81fcc03445fb","Type":"ContainerStarted","Data":"b61a7a648dc97c35dfa435c97667c1fd975b55a4ec4bfef28bbf916ba009e38f"} Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.914372 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"356d2f47-a922-458e-8578-a79ee650e100","Type":"ContainerStarted","Data":"3d2ceb5e165da57bfb6d3d52028651bd70ffe7e7ba51a707dbd268a4911957df"} Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.915826 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" event={"ID":"437026fe-8376-43fa-899e-4cd0f25468c0","Type":"ContainerStarted","Data":"ba57e8e2ced5ec112e0a88e244ef160bccad21062eba11ac155b0c7c64cfbc28"} Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.915869 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" event={"ID":"437026fe-8376-43fa-899e-4cd0f25468c0","Type":"ContainerStarted","Data":"0492a4735d3c9c492ff51fa67aac1fbfbfc931823aa2dfe3c5f20cb6c7df8e57"} Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.916029 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.919758 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.927701 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" podStartSLOduration=5.927680756 podStartE2EDuration="5.927680756s" podCreationTimestamp="2026-03-18 14:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-18 14:04:25.925977401 +0000 UTC m=+267.240395386" watchObservedRunningTime="2026-03-18 14:04:25.927680756 +0000 UTC m=+267.242098731" Mar 18 14:04:25 crc kubenswrapper[4756]: I0318 14:04:25.947063 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" podStartSLOduration=5.947040686 podStartE2EDuration="5.947040686s" podCreationTimestamp="2026-03-18 14:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:04:25.944104397 +0000 UTC m=+267.258522382" watchObservedRunningTime="2026-03-18 14:04:25.947040686 +0000 UTC m=+267.261458661" Mar 18 14:04:26 crc kubenswrapper[4756]: I0318 14:04:26.097911 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:26 crc kubenswrapper[4756]: I0318 14:04:26.928999 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"356d2f47-a922-458e-8578-a79ee650e100","Type":"ContainerStarted","Data":"5358e7736ff09b1c3b89f9788bef458ec6a748c78830cdd8ec71e6314feda929"} Mar 18 14:04:26 crc kubenswrapper[4756]: I0318 14:04:26.946404 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.946382397 podStartE2EDuration="1.946382397s" podCreationTimestamp="2026-03-18 14:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:04:26.945367309 +0000 UTC m=+268.259785284" watchObservedRunningTime="2026-03-18 14:04:26.946382397 +0000 UTC m=+268.260800382" Mar 18 14:04:29 crc kubenswrapper[4756]: I0318 14:04:29.948582 4756 generic.go:334] "Generic (PLEG): container finished" 
podID="fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" containerID="1cd60a7d0e0fe8dbed18bbe5e6275c0e2aca0589fc12c6a7d370307325481da6" exitCode=0 Mar 18 14:04:29 crc kubenswrapper[4756]: I0318 14:04:29.948748 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfqzk" event={"ID":"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b","Type":"ContainerDied","Data":"1cd60a7d0e0fe8dbed18bbe5e6275c0e2aca0589fc12c6a7d370307325481da6"} Mar 18 14:04:29 crc kubenswrapper[4756]: I0318 14:04:29.951277 4756 generic.go:334] "Generic (PLEG): container finished" podID="7bb3189f-716d-4fef-b885-3a031a60d981" containerID="58699ce81b73c57dccbc44b3771f71b4b535826b137835d929dc4811ddc3d10d" exitCode=0 Mar 18 14:04:29 crc kubenswrapper[4756]: I0318 14:04:29.951336 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w28cp" event={"ID":"7bb3189f-716d-4fef-b885-3a031a60d981","Type":"ContainerDied","Data":"58699ce81b73c57dccbc44b3771f71b4b535826b137835d929dc4811ddc3d10d"} Mar 18 14:04:30 crc kubenswrapper[4756]: I0318 14:04:30.960805 4756 generic.go:334] "Generic (PLEG): container finished" podID="48602255-9809-498e-9c4a-6053ba5ff591" containerID="343794b8ee23454efd8544022f131672c0c1d1488a888229b905d6d4fc63200c" exitCode=0 Mar 18 14:04:30 crc kubenswrapper[4756]: I0318 14:04:30.960867 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkr5r" event={"ID":"48602255-9809-498e-9c4a-6053ba5ff591","Type":"ContainerDied","Data":"343794b8ee23454efd8544022f131672c0c1d1488a888229b905d6d4fc63200c"} Mar 18 14:04:30 crc kubenswrapper[4756]: I0318 14:04:30.965101 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfqzk" event={"ID":"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b","Type":"ContainerStarted","Data":"82b7eb6f15fa7036e73e1807efb2d8c87fec2807af4921175b3749c3621896c6"} Mar 18 14:04:30 crc kubenswrapper[4756]: I0318 14:04:30.968604 
4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w28cp" event={"ID":"7bb3189f-716d-4fef-b885-3a031a60d981","Type":"ContainerStarted","Data":"28d969f89913a9a030750c4a9dd024178f31c8822a56ad7c3e58adbfb440881a"} Mar 18 14:04:31 crc kubenswrapper[4756]: I0318 14:04:31.002393 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w28cp" podStartSLOduration=2.887601821 podStartE2EDuration="50.002378523s" podCreationTimestamp="2026-03-18 14:03:41 +0000 UTC" firstStartedPulling="2026-03-18 14:03:43.306877032 +0000 UTC m=+224.621294997" lastFinishedPulling="2026-03-18 14:04:30.421653724 +0000 UTC m=+271.736071699" observedRunningTime="2026-03-18 14:04:30.999172467 +0000 UTC m=+272.313590442" watchObservedRunningTime="2026-03-18 14:04:31.002378523 +0000 UTC m=+272.316796498" Mar 18 14:04:31 crc kubenswrapper[4756]: I0318 14:04:31.020862 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tfqzk" podStartSLOduration=2.877807688 podStartE2EDuration="50.020845437s" podCreationTimestamp="2026-03-18 14:03:41 +0000 UTC" firstStartedPulling="2026-03-18 14:03:43.320670752 +0000 UTC m=+224.635088727" lastFinishedPulling="2026-03-18 14:04:30.463708511 +0000 UTC m=+271.778126476" observedRunningTime="2026-03-18 14:04:31.01869118 +0000 UTC m=+272.333109155" watchObservedRunningTime="2026-03-18 14:04:31.020845437 +0000 UTC m=+272.335263422" Mar 18 14:04:31 crc kubenswrapper[4756]: I0318 14:04:31.830443 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:04:31 crc kubenswrapper[4756]: I0318 14:04:31.830730 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:04:31 crc kubenswrapper[4756]: I0318 14:04:31.926071 4756 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:04:31 crc kubenswrapper[4756]: I0318 14:04:31.926196 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:04:31 crc kubenswrapper[4756]: I0318 14:04:31.976782 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkr5r" event={"ID":"48602255-9809-498e-9c4a-6053ba5ff591","Type":"ContainerStarted","Data":"510acfd7467e9c19579eba546928e683416362c816d3015cf2462f9f4c10361a"} Mar 18 14:04:32 crc kubenswrapper[4756]: I0318 14:04:32.334723 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lkr5r" podStartSLOduration=2.900945788 podStartE2EDuration="51.33470552s" podCreationTimestamp="2026-03-18 14:03:41 +0000 UTC" firstStartedPulling="2026-03-18 14:03:43.305469065 +0000 UTC m=+224.619887040" lastFinishedPulling="2026-03-18 14:04:31.739228797 +0000 UTC m=+273.053646772" observedRunningTime="2026-03-18 14:04:31.995717382 +0000 UTC m=+273.310135367" watchObservedRunningTime="2026-03-18 14:04:32.33470552 +0000 UTC m=+273.649123495" Mar 18 14:04:32 crc kubenswrapper[4756]: I0318 14:04:32.873927 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-w28cp" podUID="7bb3189f-716d-4fef-b885-3a031a60d981" containerName="registry-server" probeResult="failure" output=< Mar 18 14:04:32 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Mar 18 14:04:32 crc kubenswrapper[4756]: > Mar 18 14:04:32 crc kubenswrapper[4756]: I0318 14:04:32.961307 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-tfqzk" podUID="fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" containerName="registry-server" probeResult="failure" output=< Mar 18 14:04:32 crc kubenswrapper[4756]: timeout: failed to 
connect service ":50051" within 1s Mar 18 14:04:32 crc kubenswrapper[4756]: > Mar 18 14:04:32 crc kubenswrapper[4756]: I0318 14:04:32.985843 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs6l5" event={"ID":"af462049-61c3-4da5-aeb0-0311404c4741","Type":"ContainerStarted","Data":"62108441e5f5afcb75d9fb7396ed2e8fe0142b603a6e187357f391872fc6e3a2"} Mar 18 14:04:32 crc kubenswrapper[4756]: I0318 14:04:32.997014 4756 generic.go:334] "Generic (PLEG): container finished" podID="c76033c1-1ccb-42ce-ade9-f46428bc0b46" containerID="68f415a7bff7da4a1e2e81d8c1869181fa4e0f4714b79f5e10aec089fb533051" exitCode=0 Mar 18 14:04:32 crc kubenswrapper[4756]: I0318 14:04:32.997124 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jmwzc" event={"ID":"c76033c1-1ccb-42ce-ade9-f46428bc0b46","Type":"ContainerDied","Data":"68f415a7bff7da4a1e2e81d8c1869181fa4e0f4714b79f5e10aec089fb533051"} Mar 18 14:04:34 crc kubenswrapper[4756]: I0318 14:04:34.003916 4756 generic.go:334] "Generic (PLEG): container finished" podID="af462049-61c3-4da5-aeb0-0311404c4741" containerID="62108441e5f5afcb75d9fb7396ed2e8fe0142b603a6e187357f391872fc6e3a2" exitCode=0 Mar 18 14:04:34 crc kubenswrapper[4756]: I0318 14:04:34.003976 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs6l5" event={"ID":"af462049-61c3-4da5-aeb0-0311404c4741","Type":"ContainerDied","Data":"62108441e5f5afcb75d9fb7396ed2e8fe0142b603a6e187357f391872fc6e3a2"} Mar 18 14:04:34 crc kubenswrapper[4756]: I0318 14:04:34.089423 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:04:35 crc kubenswrapper[4756]: I0318 14:04:35.955150 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhpzd"] Mar 18 14:04:35 crc kubenswrapper[4756]: I0318 14:04:35.955831 4756 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dhpzd" podUID="604a2bca-c232-4242-a46c-31630b85585d" containerName="registry-server" containerID="cri-o://7fae7af2af82ed351fcb21acf573f2ec73c1c5d4f72176108f1969b485859a2f" gracePeriod=2 Mar 18 14:04:36 crc kubenswrapper[4756]: I0318 14:04:36.915847 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:04:36 crc kubenswrapper[4756]: I0318 14:04:36.916249 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:04:37 crc kubenswrapper[4756]: I0318 14:04:37.024555 4756 generic.go:334] "Generic (PLEG): container finished" podID="604a2bca-c232-4242-a46c-31630b85585d" containerID="7fae7af2af82ed351fcb21acf573f2ec73c1c5d4f72176108f1969b485859a2f" exitCode=0 Mar 18 14:04:37 crc kubenswrapper[4756]: I0318 14:04:37.024657 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhpzd" event={"ID":"604a2bca-c232-4242-a46c-31630b85585d","Type":"ContainerDied","Data":"7fae7af2af82ed351fcb21acf573f2ec73c1c5d4f72176108f1969b485859a2f"} Mar 18 14:04:37 crc kubenswrapper[4756]: I0318 14:04:37.026380 4756 generic.go:334] "Generic (PLEG): container finished" podID="caee1439-b7bb-456e-982f-1c3c3cdb51c3" containerID="9042f3a5c3deb9daa509093574c1c70fbb32e9ebd8f3c973883b1160e76ae43c" exitCode=0 Mar 18 14:04:37 crc kubenswrapper[4756]: I0318 14:04:37.026419 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-9hrhl" event={"ID":"caee1439-b7bb-456e-982f-1c3c3cdb51c3","Type":"ContainerDied","Data":"9042f3a5c3deb9daa509093574c1c70fbb32e9ebd8f3c973883b1160e76ae43c"} Mar 18 14:04:37 crc kubenswrapper[4756]: I0318 14:04:37.348542 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:04:37 crc kubenswrapper[4756]: I0318 14:04:37.456872 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/604a2bca-c232-4242-a46c-31630b85585d-catalog-content\") pod \"604a2bca-c232-4242-a46c-31630b85585d\" (UID: \"604a2bca-c232-4242-a46c-31630b85585d\") " Mar 18 14:04:37 crc kubenswrapper[4756]: I0318 14:04:37.456953 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlvf7\" (UniqueName: \"kubernetes.io/projected/604a2bca-c232-4242-a46c-31630b85585d-kube-api-access-xlvf7\") pod \"604a2bca-c232-4242-a46c-31630b85585d\" (UID: \"604a2bca-c232-4242-a46c-31630b85585d\") " Mar 18 14:04:37 crc kubenswrapper[4756]: I0318 14:04:37.457023 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/604a2bca-c232-4242-a46c-31630b85585d-utilities\") pod \"604a2bca-c232-4242-a46c-31630b85585d\" (UID: \"604a2bca-c232-4242-a46c-31630b85585d\") " Mar 18 14:04:37 crc kubenswrapper[4756]: I0318 14:04:37.457689 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/604a2bca-c232-4242-a46c-31630b85585d-utilities" (OuterVolumeSpecName: "utilities") pod "604a2bca-c232-4242-a46c-31630b85585d" (UID: "604a2bca-c232-4242-a46c-31630b85585d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:04:37 crc kubenswrapper[4756]: I0318 14:04:37.462484 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604a2bca-c232-4242-a46c-31630b85585d-kube-api-access-xlvf7" (OuterVolumeSpecName: "kube-api-access-xlvf7") pod "604a2bca-c232-4242-a46c-31630b85585d" (UID: "604a2bca-c232-4242-a46c-31630b85585d"). InnerVolumeSpecName "kube-api-access-xlvf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:04:37 crc kubenswrapper[4756]: I0318 14:04:37.487059 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/604a2bca-c232-4242-a46c-31630b85585d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "604a2bca-c232-4242-a46c-31630b85585d" (UID: "604a2bca-c232-4242-a46c-31630b85585d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:04:37 crc kubenswrapper[4756]: I0318 14:04:37.558417 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/604a2bca-c232-4242-a46c-31630b85585d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:37 crc kubenswrapper[4756]: I0318 14:04:37.558708 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlvf7\" (UniqueName: \"kubernetes.io/projected/604a2bca-c232-4242-a46c-31630b85585d-kube-api-access-xlvf7\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:37 crc kubenswrapper[4756]: I0318 14:04:37.558721 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/604a2bca-c232-4242-a46c-31630b85585d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:38 crc kubenswrapper[4756]: I0318 14:04:38.035268 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhpzd" 
event={"ID":"604a2bca-c232-4242-a46c-31630b85585d","Type":"ContainerDied","Data":"99232e276a56d718927bb756b17f640904c229559b9bedc1aacd60bbdc8f251b"} Mar 18 14:04:38 crc kubenswrapper[4756]: I0318 14:04:38.035302 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhpzd" Mar 18 14:04:38 crc kubenswrapper[4756]: I0318 14:04:38.035354 4756 scope.go:117] "RemoveContainer" containerID="7fae7af2af82ed351fcb21acf573f2ec73c1c5d4f72176108f1969b485859a2f" Mar 18 14:04:38 crc kubenswrapper[4756]: I0318 14:04:38.061405 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhpzd"] Mar 18 14:04:38 crc kubenswrapper[4756]: I0318 14:04:38.063882 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhpzd"] Mar 18 14:04:38 crc kubenswrapper[4756]: I0318 14:04:38.793763 4756 scope.go:117] "RemoveContainer" containerID="3ae0906fece88bf293fd0af9440c5685856af18e188f41744bd430dc55c90e99" Mar 18 14:04:39 crc kubenswrapper[4756]: I0318 14:04:39.043520 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs6l5" event={"ID":"af462049-61c3-4da5-aeb0-0311404c4741","Type":"ContainerStarted","Data":"0eca24fe5bac3c631ba88bd97407443f30d61316ccb50ad843025b68bf12b372"} Mar 18 14:04:39 crc kubenswrapper[4756]: I0318 14:04:39.067388 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gs6l5" podStartSLOduration=4.097785092 podStartE2EDuration="55.067372553s" podCreationTimestamp="2026-03-18 14:03:44 +0000 UTC" firstStartedPulling="2026-03-18 14:03:46.584524064 +0000 UTC m=+227.898942039" lastFinishedPulling="2026-03-18 14:04:37.554111515 +0000 UTC m=+278.868529500" observedRunningTime="2026-03-18 14:04:39.065730629 +0000 UTC m=+280.380148604" watchObservedRunningTime="2026-03-18 14:04:39.067372553 +0000 UTC m=+280.381790528" 
Mar 18 14:04:39 crc kubenswrapper[4756]: I0318 14:04:39.322413 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="604a2bca-c232-4242-a46c-31630b85585d" path="/var/lib/kubelet/pods/604a2bca-c232-4242-a46c-31630b85585d/volumes" Mar 18 14:04:40 crc kubenswrapper[4756]: I0318 14:04:39.360661 4756 scope.go:117] "RemoveContainer" containerID="bbd90c10f3d1f4bb42cacc760659b0825df4bb8e1b58616fc88f4e3eff40dd95" Mar 18 14:04:40 crc kubenswrapper[4756]: I0318 14:04:40.573689 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7556c9649b-qttmn"] Mar 18 14:04:40 crc kubenswrapper[4756]: I0318 14:04:40.573942 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" podUID="437026fe-8376-43fa-899e-4cd0f25468c0" containerName="controller-manager" containerID="cri-o://ba57e8e2ced5ec112e0a88e244ef160bccad21062eba11ac155b0c7c64cfbc28" gracePeriod=30 Mar 18 14:04:40 crc kubenswrapper[4756]: I0318 14:04:40.578069 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w"] Mar 18 14:04:40 crc kubenswrapper[4756]: I0318 14:04:40.578348 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" podUID="f909523c-3c43-495a-ad45-81fcc03445fb" containerName="route-controller-manager" containerID="cri-o://e9b4fb3e6ca901ff81f75d712c5d438206918648691a2dbd67c17982864b1d0c" gracePeriod=30 Mar 18 14:04:41 crc kubenswrapper[4756]: I0318 14:04:41.069249 4756 generic.go:334] "Generic (PLEG): container finished" podID="f909523c-3c43-495a-ad45-81fcc03445fb" containerID="e9b4fb3e6ca901ff81f75d712c5d438206918648691a2dbd67c17982864b1d0c" exitCode=0 Mar 18 14:04:41 crc kubenswrapper[4756]: I0318 14:04:41.069357 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" event={"ID":"f909523c-3c43-495a-ad45-81fcc03445fb","Type":"ContainerDied","Data":"e9b4fb3e6ca901ff81f75d712c5d438206918648691a2dbd67c17982864b1d0c"} Mar 18 14:04:41 crc kubenswrapper[4756]: I0318 14:04:41.479268 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:04:41 crc kubenswrapper[4756]: I0318 14:04:41.479333 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:04:41 crc kubenswrapper[4756]: I0318 14:04:41.517544 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:04:41 crc kubenswrapper[4756]: I0318 14:04:41.881209 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:04:41 crc kubenswrapper[4756]: I0318 14:04:41.923631 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:04:41 crc kubenswrapper[4756]: I0318 14:04:41.977312 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.008681 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.018751 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f909523c-3c43-495a-ad45-81fcc03445fb-client-ca\") pod \"f909523c-3c43-495a-ad45-81fcc03445fb\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.018792 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f909523c-3c43-495a-ad45-81fcc03445fb-serving-cert\") pod \"f909523c-3c43-495a-ad45-81fcc03445fb\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.018821 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgmsn\" (UniqueName: \"kubernetes.io/projected/f909523c-3c43-495a-ad45-81fcc03445fb-kube-api-access-wgmsn\") pod \"f909523c-3c43-495a-ad45-81fcc03445fb\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.018851 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f909523c-3c43-495a-ad45-81fcc03445fb-config\") pod \"f909523c-3c43-495a-ad45-81fcc03445fb\" (UID: \"f909523c-3c43-495a-ad45-81fcc03445fb\") " Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.019440 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f909523c-3c43-495a-ad45-81fcc03445fb-client-ca" (OuterVolumeSpecName: "client-ca") pod "f909523c-3c43-495a-ad45-81fcc03445fb" (UID: "f909523c-3c43-495a-ad45-81fcc03445fb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.020518 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f909523c-3c43-495a-ad45-81fcc03445fb-config" (OuterVolumeSpecName: "config") pod "f909523c-3c43-495a-ad45-81fcc03445fb" (UID: "f909523c-3c43-495a-ad45-81fcc03445fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.022597 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.023897 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f909523c-3c43-495a-ad45-81fcc03445fb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f909523c-3c43-495a-ad45-81fcc03445fb" (UID: "f909523c-3c43-495a-ad45-81fcc03445fb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.026868 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f909523c-3c43-495a-ad45-81fcc03445fb-kube-api-access-wgmsn" (OuterVolumeSpecName: "kube-api-access-wgmsn") pod "f909523c-3c43-495a-ad45-81fcc03445fb" (UID: "f909523c-3c43-495a-ad45-81fcc03445fb"). InnerVolumeSpecName "kube-api-access-wgmsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.039474 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6"] Mar 18 14:04:42 crc kubenswrapper[4756]: E0318 14:04:42.039704 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604a2bca-c232-4242-a46c-31630b85585d" containerName="extract-content" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.039722 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="604a2bca-c232-4242-a46c-31630b85585d" containerName="extract-content" Mar 18 14:04:42 crc kubenswrapper[4756]: E0318 14:04:42.039732 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f909523c-3c43-495a-ad45-81fcc03445fb" containerName="route-controller-manager" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.039740 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f909523c-3c43-495a-ad45-81fcc03445fb" containerName="route-controller-manager" Mar 18 14:04:42 crc kubenswrapper[4756]: E0318 14:04:42.039754 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604a2bca-c232-4242-a46c-31630b85585d" containerName="registry-server" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.039760 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="604a2bca-c232-4242-a46c-31630b85585d" containerName="registry-server" Mar 18 14:04:42 crc kubenswrapper[4756]: E0318 14:04:42.039772 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604a2bca-c232-4242-a46c-31630b85585d" containerName="extract-utilities" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.039778 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="604a2bca-c232-4242-a46c-31630b85585d" containerName="extract-utilities" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.040061 4756 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="604a2bca-c232-4242-a46c-31630b85585d" containerName="registry-server" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.040076 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f909523c-3c43-495a-ad45-81fcc03445fb" containerName="route-controller-manager" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.040467 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.051068 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6"] Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.086636 4756 generic.go:334] "Generic (PLEG): container finished" podID="437026fe-8376-43fa-899e-4cd0f25468c0" containerID="ba57e8e2ced5ec112e0a88e244ef160bccad21062eba11ac155b0c7c64cfbc28" exitCode=0 Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.086701 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" event={"ID":"437026fe-8376-43fa-899e-4cd0f25468c0","Type":"ContainerDied","Data":"ba57e8e2ced5ec112e0a88e244ef160bccad21062eba11ac155b0c7c64cfbc28"} Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.088511 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.096404 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w" event={"ID":"f909523c-3c43-495a-ad45-81fcc03445fb","Type":"ContainerDied","Data":"b61a7a648dc97c35dfa435c97667c1fd975b55a4ec4bfef28bbf916ba009e38f"} Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.096434 4756 scope.go:117] "RemoveContainer" containerID="e9b4fb3e6ca901ff81f75d712c5d438206918648691a2dbd67c17982864b1d0c" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.120952 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/794af0e9-709b-4cbd-9d8e-a5b4921a8363-config\") pod \"route-controller-manager-595d8b7464-xlmw6\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.121072 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j54z\" (UniqueName: \"kubernetes.io/projected/794af0e9-709b-4cbd-9d8e-a5b4921a8363-kube-api-access-2j54z\") pod \"route-controller-manager-595d8b7464-xlmw6\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.121099 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/794af0e9-709b-4cbd-9d8e-a5b4921a8363-serving-cert\") pod \"route-controller-manager-595d8b7464-xlmw6\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " 
pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.121198 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/794af0e9-709b-4cbd-9d8e-a5b4921a8363-client-ca\") pod \"route-controller-manager-595d8b7464-xlmw6\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.121254 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f909523c-3c43-495a-ad45-81fcc03445fb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.121267 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f909523c-3c43-495a-ad45-81fcc03445fb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.121279 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgmsn\" (UniqueName: \"kubernetes.io/projected/f909523c-3c43-495a-ad45-81fcc03445fb-kube-api-access-wgmsn\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.121291 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f909523c-3c43-495a-ad45-81fcc03445fb-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.131706 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w"] Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.134700 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d7787d89-s9n5w"] Mar 18 
14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.139563 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.221990 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/794af0e9-709b-4cbd-9d8e-a5b4921a8363-config\") pod \"route-controller-manager-595d8b7464-xlmw6\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.222074 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j54z\" (UniqueName: \"kubernetes.io/projected/794af0e9-709b-4cbd-9d8e-a5b4921a8363-kube-api-access-2j54z\") pod \"route-controller-manager-595d8b7464-xlmw6\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.222105 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/794af0e9-709b-4cbd-9d8e-a5b4921a8363-serving-cert\") pod \"route-controller-manager-595d8b7464-xlmw6\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.222202 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/794af0e9-709b-4cbd-9d8e-a5b4921a8363-client-ca\") pod \"route-controller-manager-595d8b7464-xlmw6\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.223202 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/794af0e9-709b-4cbd-9d8e-a5b4921a8363-client-ca\") pod \"route-controller-manager-595d8b7464-xlmw6\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.223346 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/794af0e9-709b-4cbd-9d8e-a5b4921a8363-config\") pod \"route-controller-manager-595d8b7464-xlmw6\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.228013 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/794af0e9-709b-4cbd-9d8e-a5b4921a8363-serving-cert\") pod \"route-controller-manager-595d8b7464-xlmw6\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.240106 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j54z\" (UniqueName: \"kubernetes.io/projected/794af0e9-709b-4cbd-9d8e-a5b4921a8363-kube-api-access-2j54z\") pod \"route-controller-manager-595d8b7464-xlmw6\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:42 crc kubenswrapper[4756]: I0318 14:04:42.355804 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.099498 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jmwzc" event={"ID":"c76033c1-1ccb-42ce-ade9-f46428bc0b46","Type":"ContainerStarted","Data":"d7cb03e9d43784ce433c55623944799c1fce8a16edf40d27511647139d316110"} Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.101579 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn8ht" event={"ID":"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55","Type":"ContainerStarted","Data":"e0db8b8042435eed78ead95d40ac39734562e459578ba26e4f0f40e604c3afc8"} Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.326721 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f909523c-3c43-495a-ad45-81fcc03445fb" path="/var/lib/kubelet/pods/f909523c-3c43-495a-ad45-81fcc03445fb/volumes" Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.398798 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.541756 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-client-ca\") pod \"437026fe-8376-43fa-899e-4cd0f25468c0\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.542100 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-proxy-ca-bundles\") pod \"437026fe-8376-43fa-899e-4cd0f25468c0\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.542160 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/437026fe-8376-43fa-899e-4cd0f25468c0-serving-cert\") pod \"437026fe-8376-43fa-899e-4cd0f25468c0\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.542188 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-config\") pod \"437026fe-8376-43fa-899e-4cd0f25468c0\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.542218 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d828f\" (UniqueName: \"kubernetes.io/projected/437026fe-8376-43fa-899e-4cd0f25468c0-kube-api-access-d828f\") pod \"437026fe-8376-43fa-899e-4cd0f25468c0\" (UID: \"437026fe-8376-43fa-899e-4cd0f25468c0\") " Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.542956 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "437026fe-8376-43fa-899e-4cd0f25468c0" (UID: "437026fe-8376-43fa-899e-4cd0f25468c0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.543010 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-config" (OuterVolumeSpecName: "config") pod "437026fe-8376-43fa-899e-4cd0f25468c0" (UID: "437026fe-8376-43fa-899e-4cd0f25468c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.543807 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-client-ca" (OuterVolumeSpecName: "client-ca") pod "437026fe-8376-43fa-899e-4cd0f25468c0" (UID: "437026fe-8376-43fa-899e-4cd0f25468c0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.551372 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437026fe-8376-43fa-899e-4cd0f25468c0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "437026fe-8376-43fa-899e-4cd0f25468c0" (UID: "437026fe-8376-43fa-899e-4cd0f25468c0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.555563 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/437026fe-8376-43fa-899e-4cd0f25468c0-kube-api-access-d828f" (OuterVolumeSpecName: "kube-api-access-d828f") pod "437026fe-8376-43fa-899e-4cd0f25468c0" (UID: "437026fe-8376-43fa-899e-4cd0f25468c0"). InnerVolumeSpecName "kube-api-access-d828f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.561029 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tfqzk"] Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.561382 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tfqzk" podUID="fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" containerName="registry-server" containerID="cri-o://82b7eb6f15fa7036e73e1807efb2d8c87fec2807af4921175b3749c3621896c6" gracePeriod=2 Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.617030 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6"] Mar 18 14:04:43 crc kubenswrapper[4756]: W0318 14:04:43.629556 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod794af0e9_709b_4cbd_9d8e_a5b4921a8363.slice/crio-0af51f0654295299bf82bbf5e7e824ce04b4b939f68c9893ea0cfe303de1b3ea WatchSource:0}: Error finding container 0af51f0654295299bf82bbf5e7e824ce04b4b939f68c9893ea0cfe303de1b3ea: Status 404 returned error can't find the container with id 0af51f0654295299bf82bbf5e7e824ce04b4b939f68c9893ea0cfe303de1b3ea Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.643389 4756 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.643422 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/437026fe-8376-43fa-899e-4cd0f25468c0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.643432 4756 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.643443 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d828f\" (UniqueName: \"kubernetes.io/projected/437026fe-8376-43fa-899e-4cd0f25468c0-kube-api-access-d828f\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:43 crc kubenswrapper[4756]: I0318 14:04:43.643453 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/437026fe-8376-43fa-899e-4cd0f25468c0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.109875 4756 generic.go:334] "Generic (PLEG): container finished" podID="d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" containerID="e0db8b8042435eed78ead95d40ac39734562e459578ba26e4f0f40e604c3afc8" exitCode=0 Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.109965 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn8ht" event={"ID":"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55","Type":"ContainerDied","Data":"e0db8b8042435eed78ead95d40ac39734562e459578ba26e4f0f40e604c3afc8"} Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.112947 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hrhl" event={"ID":"caee1439-b7bb-456e-982f-1c3c3cdb51c3","Type":"ContainerStarted","Data":"e26f416eeb1167ba699cba6d84e1fedae84f98109849c081a527ee30b31a601c"} Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.115062 4756 generic.go:334] "Generic (PLEG): container finished" podID="fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" containerID="82b7eb6f15fa7036e73e1807efb2d8c87fec2807af4921175b3749c3621896c6" exitCode=0 Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.115132 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfqzk" 
event={"ID":"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b","Type":"ContainerDied","Data":"82b7eb6f15fa7036e73e1807efb2d8c87fec2807af4921175b3749c3621896c6"} Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.116512 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" event={"ID":"794af0e9-709b-4cbd-9d8e-a5b4921a8363","Type":"ContainerStarted","Data":"214782fec7483d6da42a8379027bb4f4b9c0d5618a319d560c0e5367a5195681"} Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.116538 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" event={"ID":"794af0e9-709b-4cbd-9d8e-a5b4921a8363","Type":"ContainerStarted","Data":"0af51f0654295299bf82bbf5e7e824ce04b4b939f68c9893ea0cfe303de1b3ea"} Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.116812 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.117733 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.118013 4756 patch_prober.go:28] interesting pod/route-controller-manager-595d8b7464-xlmw6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body= Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.118055 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" podUID="794af0e9-709b-4cbd-9d8e-a5b4921a8363" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.118098 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7556c9649b-qttmn" event={"ID":"437026fe-8376-43fa-899e-4cd0f25468c0","Type":"ContainerDied","Data":"0492a4735d3c9c492ff51fa67aac1fbfbfc931823aa2dfe3c5f20cb6c7df8e57"} Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.118139 4756 scope.go:117] "RemoveContainer" containerID="ba57e8e2ced5ec112e0a88e244ef160bccad21062eba11ac155b0c7c64cfbc28" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.158351 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9hrhl" podStartSLOduration=2.237575509 podStartE2EDuration="1m1.158335005s" podCreationTimestamp="2026-03-18 14:03:43 +0000 UTC" firstStartedPulling="2026-03-18 14:03:44.412776468 +0000 UTC m=+225.727194443" lastFinishedPulling="2026-03-18 14:04:43.333535964 +0000 UTC m=+284.647953939" observedRunningTime="2026-03-18 14:04:44.156162627 +0000 UTC m=+285.470580612" watchObservedRunningTime="2026-03-18 
14:04:44.158335005 +0000 UTC m=+285.472752980" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.179091 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jmwzc" podStartSLOduration=5.995534387 podStartE2EDuration="1m3.179071461s" podCreationTimestamp="2026-03-18 14:03:41 +0000 UTC" firstStartedPulling="2026-03-18 14:03:44.413473036 +0000 UTC m=+225.727891011" lastFinishedPulling="2026-03-18 14:04:41.59701011 +0000 UTC m=+282.911428085" observedRunningTime="2026-03-18 14:04:44.175976208 +0000 UTC m=+285.490394193" watchObservedRunningTime="2026-03-18 14:04:44.179071461 +0000 UTC m=+285.493489446" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.215106 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" podStartSLOduration=4.215087996 podStartE2EDuration="4.215087996s" podCreationTimestamp="2026-03-18 14:04:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:04:44.200206068 +0000 UTC m=+285.514624053" watchObservedRunningTime="2026-03-18 14:04:44.215087996 +0000 UTC m=+285.529505971" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.217742 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7556c9649b-qttmn"] Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.220404 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7556c9649b-qttmn"] Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.565369 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.585965 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8"] Mar 18 14:04:44 crc kubenswrapper[4756]: E0318 14:04:44.586203 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" containerName="extract-utilities" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.586221 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" containerName="extract-utilities" Mar 18 14:04:44 crc kubenswrapper[4756]: E0318 14:04:44.586229 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" containerName="extract-content" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.586237 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" containerName="extract-content" Mar 18 14:04:44 crc kubenswrapper[4756]: E0318 14:04:44.586256 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437026fe-8376-43fa-899e-4cd0f25468c0" containerName="controller-manager" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.586263 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="437026fe-8376-43fa-899e-4cd0f25468c0" containerName="controller-manager" Mar 18 14:04:44 crc kubenswrapper[4756]: E0318 14:04:44.586270 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" containerName="registry-server" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.586276 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" containerName="registry-server" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.586362 4756 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="437026fe-8376-43fa-899e-4cd0f25468c0" containerName="controller-manager" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.586372 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" containerName="registry-server" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.586756 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.589808 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.590502 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.590558 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.590613 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.591870 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.593074 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.598219 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.607317 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8"] Mar 18 14:04:44 crc 
kubenswrapper[4756]: I0318 14:04:44.658623 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f81be97-aa68-40db-b435-59dc4773998d-serving-cert\") pod \"controller-manager-5d5f9ff9d6-nt9q8\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.658693 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7zjx\" (UniqueName: \"kubernetes.io/projected/2f81be97-aa68-40db-b435-59dc4773998d-kube-api-access-q7zjx\") pod \"controller-manager-5d5f9ff9d6-nt9q8\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.658751 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-config\") pod \"controller-manager-5d5f9ff9d6-nt9q8\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.658813 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-proxy-ca-bundles\") pod \"controller-manager-5d5f9ff9d6-nt9q8\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.658872 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-client-ca\") pod \"controller-manager-5d5f9ff9d6-nt9q8\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.710241 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.710297 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.759558 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-catalog-content\") pod \"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b\" (UID: \"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b\") " Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.759841 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-utilities\") pod \"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b\" (UID: \"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b\") " Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.759936 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzbbt\" (UniqueName: \"kubernetes.io/projected/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-kube-api-access-vzbbt\") pod \"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b\" (UID: \"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b\") " Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.760194 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f81be97-aa68-40db-b435-59dc4773998d-serving-cert\") pod \"controller-manager-5d5f9ff9d6-nt9q8\" (UID: 
\"2f81be97-aa68-40db-b435-59dc4773998d\") " pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.760224 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7zjx\" (UniqueName: \"kubernetes.io/projected/2f81be97-aa68-40db-b435-59dc4773998d-kube-api-access-q7zjx\") pod \"controller-manager-5d5f9ff9d6-nt9q8\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.760265 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-config\") pod \"controller-manager-5d5f9ff9d6-nt9q8\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.760337 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-proxy-ca-bundles\") pod \"controller-manager-5d5f9ff9d6-nt9q8\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.760451 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-client-ca\") pod \"controller-manager-5d5f9ff9d6-nt9q8\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.760463 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-utilities" (OuterVolumeSpecName: "utilities") pod "fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" (UID: "fb1afba2-1ba1-43c6-9a0c-740f6504fa8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.760523 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.761512 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-proxy-ca-bundles\") pod \"controller-manager-5d5f9ff9d6-nt9q8\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.761625 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-client-ca\") pod \"controller-manager-5d5f9ff9d6-nt9q8\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.762092 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-config\") pod \"controller-manager-5d5f9ff9d6-nt9q8\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.768532 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-kube-api-access-vzbbt" 
(OuterVolumeSpecName: "kube-api-access-vzbbt") pod "fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" (UID: "fb1afba2-1ba1-43c6-9a0c-740f6504fa8b"). InnerVolumeSpecName "kube-api-access-vzbbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.769780 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f81be97-aa68-40db-b435-59dc4773998d-serving-cert\") pod \"controller-manager-5d5f9ff9d6-nt9q8\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.778477 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7zjx\" (UniqueName: \"kubernetes.io/projected/2f81be97-aa68-40db-b435-59dc4773998d-kube-api-access-q7zjx\") pod \"controller-manager-5d5f9ff9d6-nt9q8\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.825677 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" (UID: "fb1afba2-1ba1-43c6-9a0c-740f6504fa8b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.861212 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.861246 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzbbt\" (UniqueName: \"kubernetes.io/projected/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b-kube-api-access-vzbbt\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:44 crc kubenswrapper[4756]: I0318 14:04:44.908587 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:45 crc kubenswrapper[4756]: I0318 14:04:45.125994 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tfqzk" Mar 18 14:04:45 crc kubenswrapper[4756]: I0318 14:04:45.126140 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tfqzk" event={"ID":"fb1afba2-1ba1-43c6-9a0c-740f6504fa8b","Type":"ContainerDied","Data":"af35670de8506cd4bf15639878f63745dde2272b4ce9b2c8ec7b61efd822045e"} Mar 18 14:04:45 crc kubenswrapper[4756]: I0318 14:04:45.126513 4756 scope.go:117] "RemoveContainer" containerID="82b7eb6f15fa7036e73e1807efb2d8c87fec2807af4921175b3749c3621896c6" Mar 18 14:04:45 crc kubenswrapper[4756]: I0318 14:04:45.139634 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn8ht" event={"ID":"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55","Type":"ContainerStarted","Data":"703905c8b61f86593cef4e862c6732d822a58a9e65de3442fba11edf0363a1af"} Mar 18 14:04:45 crc kubenswrapper[4756]: I0318 14:04:45.146683 4756 scope.go:117] "RemoveContainer" 
containerID="1cd60a7d0e0fe8dbed18bbe5e6275c0e2aca0589fc12c6a7d370307325481da6" Mar 18 14:04:45 crc kubenswrapper[4756]: I0318 14:04:45.147150 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:04:45 crc kubenswrapper[4756]: I0318 14:04:45.160999 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cn8ht" podStartSLOduration=3.082910815 podStartE2EDuration="1m1.160980865s" podCreationTimestamp="2026-03-18 14:03:44 +0000 UTC" firstStartedPulling="2026-03-18 14:03:46.581351499 +0000 UTC m=+227.895769474" lastFinishedPulling="2026-03-18 14:04:44.659421559 +0000 UTC m=+285.973839524" observedRunningTime="2026-03-18 14:04:45.156164726 +0000 UTC m=+286.470582701" watchObservedRunningTime="2026-03-18 14:04:45.160980865 +0000 UTC m=+286.475398840" Mar 18 14:04:45 crc kubenswrapper[4756]: I0318 14:04:45.177314 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tfqzk"] Mar 18 14:04:45 crc kubenswrapper[4756]: I0318 14:04:45.180459 4756 scope.go:117] "RemoveContainer" containerID="817d71c44fbce04b94845dbc4acca2b0e2404867dd1e608dbbd975a67d91af96" Mar 18 14:04:45 crc kubenswrapper[4756]: I0318 14:04:45.186505 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tfqzk"] Mar 18 14:04:45 crc kubenswrapper[4756]: I0318 14:04:45.302570 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8"] Mar 18 14:04:45 crc kubenswrapper[4756]: W0318 14:04:45.319089 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f81be97_aa68_40db_b435_59dc4773998d.slice/crio-b75a3a5839ad159a6da30752e0cd2d4c9c0b61f5c1b82b890bf6c806d6635ceb WatchSource:0}: Error finding container 
b75a3a5839ad159a6da30752e0cd2d4c9c0b61f5c1b82b890bf6c806d6635ceb: Status 404 returned error can't find the container with id b75a3a5839ad159a6da30752e0cd2d4c9c0b61f5c1b82b890bf6c806d6635ceb Mar 18 14:04:45 crc kubenswrapper[4756]: I0318 14:04:45.324435 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="437026fe-8376-43fa-899e-4cd0f25468c0" path="/var/lib/kubelet/pods/437026fe-8376-43fa-899e-4cd0f25468c0/volumes" Mar 18 14:04:45 crc kubenswrapper[4756]: I0318 14:04:45.325079 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1afba2-1ba1-43c6-9a0c-740f6504fa8b" path="/var/lib/kubelet/pods/fb1afba2-1ba1-43c6-9a0c-740f6504fa8b/volumes" Mar 18 14:04:45 crc kubenswrapper[4756]: I0318 14:04:45.754565 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gs6l5" podUID="af462049-61c3-4da5-aeb0-0311404c4741" containerName="registry-server" probeResult="failure" output=< Mar 18 14:04:45 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Mar 18 14:04:45 crc kubenswrapper[4756]: > Mar 18 14:04:46 crc kubenswrapper[4756]: I0318 14:04:46.146731 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" event={"ID":"2f81be97-aa68-40db-b435-59dc4773998d","Type":"ContainerStarted","Data":"4de387cdeb92616cb90f321e40421fa1f0381d7b5076fd57856d705b32035614"} Mar 18 14:04:46 crc kubenswrapper[4756]: I0318 14:04:46.146782 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" event={"ID":"2f81be97-aa68-40db-b435-59dc4773998d","Type":"ContainerStarted","Data":"b75a3a5839ad159a6da30752e0cd2d4c9c0b61f5c1b82b890bf6c806d6635ceb"} Mar 18 14:04:46 crc kubenswrapper[4756]: I0318 14:04:46.147249 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 
14:04:46 crc kubenswrapper[4756]: I0318 14:04:46.153688 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:04:46 crc kubenswrapper[4756]: I0318 14:04:46.170254 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" podStartSLOduration=6.170236871 podStartE2EDuration="6.170236871s" podCreationTimestamp="2026-03-18 14:04:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:04:46.167319233 +0000 UTC m=+287.481737248" watchObservedRunningTime="2026-03-18 14:04:46.170236871 +0000 UTC m=+287.484654846" Mar 18 14:04:52 crc kubenswrapper[4756]: I0318 14:04:52.119320 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:04:52 crc kubenswrapper[4756]: I0318 14:04:52.120616 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:04:52 crc kubenswrapper[4756]: I0318 14:04:52.165288 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:04:52 crc kubenswrapper[4756]: I0318 14:04:52.238037 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:04:52 crc kubenswrapper[4756]: I0318 14:04:52.415936 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jmwzc"] Mar 18 14:04:53 crc kubenswrapper[4756]: I0318 14:04:53.651532 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:04:53 crc kubenswrapper[4756]: I0318 14:04:53.651789 4756 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:04:53 crc kubenswrapper[4756]: I0318 14:04:53.722928 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:04:53 crc kubenswrapper[4756]: I0318 14:04:53.949692 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9sqqb"] Mar 18 14:04:54 crc kubenswrapper[4756]: I0318 14:04:54.190064 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jmwzc" podUID="c76033c1-1ccb-42ce-ade9-f46428bc0b46" containerName="registry-server" containerID="cri-o://d7cb03e9d43784ce433c55623944799c1fce8a16edf40d27511647139d316110" gracePeriod=2 Mar 18 14:04:54 crc kubenswrapper[4756]: I0318 14:04:54.240509 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:04:54 crc kubenswrapper[4756]: I0318 14:04:54.762991 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:04:54 crc kubenswrapper[4756]: I0318 14:04:54.810629 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:04:55 crc kubenswrapper[4756]: I0318 14:04:55.049305 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:04:55 crc kubenswrapper[4756]: I0318 14:04:55.049357 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:04:55 crc kubenswrapper[4756]: I0318 14:04:55.112482 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:04:55 crc 
kubenswrapper[4756]: I0318 14:04:55.197704 4756 generic.go:334] "Generic (PLEG): container finished" podID="c76033c1-1ccb-42ce-ade9-f46428bc0b46" containerID="d7cb03e9d43784ce433c55623944799c1fce8a16edf40d27511647139d316110" exitCode=0 Mar 18 14:04:55 crc kubenswrapper[4756]: I0318 14:04:55.197834 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jmwzc" event={"ID":"c76033c1-1ccb-42ce-ade9-f46428bc0b46","Type":"ContainerDied","Data":"d7cb03e9d43784ce433c55623944799c1fce8a16edf40d27511647139d316110"} Mar 18 14:04:55 crc kubenswrapper[4756]: I0318 14:04:55.233864 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:04:55 crc kubenswrapper[4756]: I0318 14:04:55.804987 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:04:55 crc kubenswrapper[4756]: I0318 14:04:55.917235 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gcxc\" (UniqueName: \"kubernetes.io/projected/c76033c1-1ccb-42ce-ade9-f46428bc0b46-kube-api-access-5gcxc\") pod \"c76033c1-1ccb-42ce-ade9-f46428bc0b46\" (UID: \"c76033c1-1ccb-42ce-ade9-f46428bc0b46\") " Mar 18 14:04:55 crc kubenswrapper[4756]: I0318 14:04:55.917609 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c76033c1-1ccb-42ce-ade9-f46428bc0b46-utilities\") pod \"c76033c1-1ccb-42ce-ade9-f46428bc0b46\" (UID: \"c76033c1-1ccb-42ce-ade9-f46428bc0b46\") " Mar 18 14:04:55 crc kubenswrapper[4756]: I0318 14:04:55.917763 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c76033c1-1ccb-42ce-ade9-f46428bc0b46-catalog-content\") pod \"c76033c1-1ccb-42ce-ade9-f46428bc0b46\" (UID: 
\"c76033c1-1ccb-42ce-ade9-f46428bc0b46\") " Mar 18 14:04:55 crc kubenswrapper[4756]: I0318 14:04:55.918648 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c76033c1-1ccb-42ce-ade9-f46428bc0b46-utilities" (OuterVolumeSpecName: "utilities") pod "c76033c1-1ccb-42ce-ade9-f46428bc0b46" (UID: "c76033c1-1ccb-42ce-ade9-f46428bc0b46"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:04:55 crc kubenswrapper[4756]: I0318 14:04:55.924630 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c76033c1-1ccb-42ce-ade9-f46428bc0b46-kube-api-access-5gcxc" (OuterVolumeSpecName: "kube-api-access-5gcxc") pod "c76033c1-1ccb-42ce-ade9-f46428bc0b46" (UID: "c76033c1-1ccb-42ce-ade9-f46428bc0b46"). InnerVolumeSpecName "kube-api-access-5gcxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:04:55 crc kubenswrapper[4756]: I0318 14:04:55.978387 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c76033c1-1ccb-42ce-ade9-f46428bc0b46-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c76033c1-1ccb-42ce-ade9-f46428bc0b46" (UID: "c76033c1-1ccb-42ce-ade9-f46428bc0b46"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:04:56 crc kubenswrapper[4756]: I0318 14:04:56.020002 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c76033c1-1ccb-42ce-ade9-f46428bc0b46-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:56 crc kubenswrapper[4756]: I0318 14:04:56.020062 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c76033c1-1ccb-42ce-ade9-f46428bc0b46-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:56 crc kubenswrapper[4756]: I0318 14:04:56.020090 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gcxc\" (UniqueName: \"kubernetes.io/projected/c76033c1-1ccb-42ce-ade9-f46428bc0b46-kube-api-access-5gcxc\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:56 crc kubenswrapper[4756]: I0318 14:04:56.220810 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jmwzc" event={"ID":"c76033c1-1ccb-42ce-ade9-f46428bc0b46","Type":"ContainerDied","Data":"25fbc5913f83b47d1ccd0fc00ffa34442cd0ee062fe9b6763d9988f46f2640bc"} Mar 18 14:04:56 crc kubenswrapper[4756]: I0318 14:04:56.220990 4756 scope.go:117] "RemoveContainer" containerID="d7cb03e9d43784ce433c55623944799c1fce8a16edf40d27511647139d316110" Mar 18 14:04:56 crc kubenswrapper[4756]: I0318 14:04:56.221355 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jmwzc" Mar 18 14:04:56 crc kubenswrapper[4756]: I0318 14:04:56.251584 4756 scope.go:117] "RemoveContainer" containerID="68f415a7bff7da4a1e2e81d8c1869181fa4e0f4714b79f5e10aec089fb533051" Mar 18 14:04:56 crc kubenswrapper[4756]: I0318 14:04:56.270608 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jmwzc"] Mar 18 14:04:56 crc kubenswrapper[4756]: I0318 14:04:56.273831 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jmwzc"] Mar 18 14:04:56 crc kubenswrapper[4756]: I0318 14:04:56.296394 4756 scope.go:117] "RemoveContainer" containerID="ff1c83ffdece8817475d2b00bc4d5840cb8ebe317ec27c721e3a19407dc3f3cf" Mar 18 14:04:57 crc kubenswrapper[4756]: I0318 14:04:57.322606 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c76033c1-1ccb-42ce-ade9-f46428bc0b46" path="/var/lib/kubelet/pods/c76033c1-1ccb-42ce-ade9-f46428bc0b46/volumes" Mar 18 14:04:58 crc kubenswrapper[4756]: I0318 14:04:58.412939 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cn8ht"] Mar 18 14:04:58 crc kubenswrapper[4756]: I0318 14:04:58.413770 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cn8ht" podUID="d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" containerName="registry-server" containerID="cri-o://703905c8b61f86593cef4e862c6732d822a58a9e65de3442fba11edf0363a1af" gracePeriod=2 Mar 18 14:04:58 crc kubenswrapper[4756]: I0318 14:04:58.920956 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:04:58 crc kubenswrapper[4756]: I0318 14:04:58.958621 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-utilities\") pod \"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55\" (UID: \"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55\") " Mar 18 14:04:58 crc kubenswrapper[4756]: I0318 14:04:58.958718 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-catalog-content\") pod \"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55\" (UID: \"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55\") " Mar 18 14:04:58 crc kubenswrapper[4756]: I0318 14:04:58.958747 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt6kl\" (UniqueName: \"kubernetes.io/projected/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-kube-api-access-pt6kl\") pod \"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55\" (UID: \"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55\") " Mar 18 14:04:58 crc kubenswrapper[4756]: I0318 14:04:58.959473 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-utilities" (OuterVolumeSpecName: "utilities") pod "d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" (UID: "d5a6ae3a-3fd6-4254-9ca5-8654eee53b55"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:04:58 crc kubenswrapper[4756]: I0318 14:04:58.964001 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-kube-api-access-pt6kl" (OuterVolumeSpecName: "kube-api-access-pt6kl") pod "d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" (UID: "d5a6ae3a-3fd6-4254-9ca5-8654eee53b55"). InnerVolumeSpecName "kube-api-access-pt6kl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.059944 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.059970 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt6kl\" (UniqueName: \"kubernetes.io/projected/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-kube-api-access-pt6kl\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.110507 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" (UID: "d5a6ae3a-3fd6-4254-9ca5-8654eee53b55"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.160845 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.241440 4756 generic.go:334] "Generic (PLEG): container finished" podID="d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" containerID="703905c8b61f86593cef4e862c6732d822a58a9e65de3442fba11edf0363a1af" exitCode=0 Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.241720 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn8ht" event={"ID":"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55","Type":"ContainerDied","Data":"703905c8b61f86593cef4e862c6732d822a58a9e65de3442fba11edf0363a1af"} Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.241800 4756 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-cn8ht" event={"ID":"d5a6ae3a-3fd6-4254-9ca5-8654eee53b55","Type":"ContainerDied","Data":"f131c64ccd8383e1644568a192914c0c825e5cd2213507c4eee948653ab6d239"} Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.241885 4756 scope.go:117] "RemoveContainer" containerID="703905c8b61f86593cef4e862c6732d822a58a9e65de3442fba11edf0363a1af" Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.242039 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cn8ht" Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.266701 4756 scope.go:117] "RemoveContainer" containerID="e0db8b8042435eed78ead95d40ac39734562e459578ba26e4f0f40e604c3afc8" Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.275767 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cn8ht"] Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.280395 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cn8ht"] Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.294744 4756 scope.go:117] "RemoveContainer" containerID="ac26d600defb2938d1ab4c3bc8730842bb10051bb0050048102e489bcddf7cea" Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.314235 4756 scope.go:117] "RemoveContainer" containerID="703905c8b61f86593cef4e862c6732d822a58a9e65de3442fba11edf0363a1af" Mar 18 14:04:59 crc kubenswrapper[4756]: E0318 14:04:59.315701 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"703905c8b61f86593cef4e862c6732d822a58a9e65de3442fba11edf0363a1af\": container with ID starting with 703905c8b61f86593cef4e862c6732d822a58a9e65de3442fba11edf0363a1af not found: ID does not exist" containerID="703905c8b61f86593cef4e862c6732d822a58a9e65de3442fba11edf0363a1af" Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.315748 4756 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"703905c8b61f86593cef4e862c6732d822a58a9e65de3442fba11edf0363a1af"} err="failed to get container status \"703905c8b61f86593cef4e862c6732d822a58a9e65de3442fba11edf0363a1af\": rpc error: code = NotFound desc = could not find container \"703905c8b61f86593cef4e862c6732d822a58a9e65de3442fba11edf0363a1af\": container with ID starting with 703905c8b61f86593cef4e862c6732d822a58a9e65de3442fba11edf0363a1af not found: ID does not exist" Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.315774 4756 scope.go:117] "RemoveContainer" containerID="e0db8b8042435eed78ead95d40ac39734562e459578ba26e4f0f40e604c3afc8" Mar 18 14:04:59 crc kubenswrapper[4756]: E0318 14:04:59.316079 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0db8b8042435eed78ead95d40ac39734562e459578ba26e4f0f40e604c3afc8\": container with ID starting with e0db8b8042435eed78ead95d40ac39734562e459578ba26e4f0f40e604c3afc8 not found: ID does not exist" containerID="e0db8b8042435eed78ead95d40ac39734562e459578ba26e4f0f40e604c3afc8" Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.316146 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0db8b8042435eed78ead95d40ac39734562e459578ba26e4f0f40e604c3afc8"} err="failed to get container status \"e0db8b8042435eed78ead95d40ac39734562e459578ba26e4f0f40e604c3afc8\": rpc error: code = NotFound desc = could not find container \"e0db8b8042435eed78ead95d40ac39734562e459578ba26e4f0f40e604c3afc8\": container with ID starting with e0db8b8042435eed78ead95d40ac39734562e459578ba26e4f0f40e604c3afc8 not found: ID does not exist" Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.316179 4756 scope.go:117] "RemoveContainer" containerID="ac26d600defb2938d1ab4c3bc8730842bb10051bb0050048102e489bcddf7cea" Mar 18 14:04:59 crc kubenswrapper[4756]: E0318 
14:04:59.317481 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac26d600defb2938d1ab4c3bc8730842bb10051bb0050048102e489bcddf7cea\": container with ID starting with ac26d600defb2938d1ab4c3bc8730842bb10051bb0050048102e489bcddf7cea not found: ID does not exist" containerID="ac26d600defb2938d1ab4c3bc8730842bb10051bb0050048102e489bcddf7cea" Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.317506 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac26d600defb2938d1ab4c3bc8730842bb10051bb0050048102e489bcddf7cea"} err="failed to get container status \"ac26d600defb2938d1ab4c3bc8730842bb10051bb0050048102e489bcddf7cea\": rpc error: code = NotFound desc = could not find container \"ac26d600defb2938d1ab4c3bc8730842bb10051bb0050048102e489bcddf7cea\": container with ID starting with ac26d600defb2938d1ab4c3bc8730842bb10051bb0050048102e489bcddf7cea not found: ID does not exist" Mar 18 14:04:59 crc kubenswrapper[4756]: I0318 14:04:59.322883 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" path="/var/lib/kubelet/pods/d5a6ae3a-3fd6-4254-9ca5-8654eee53b55/volumes" Mar 18 14:05:00 crc kubenswrapper[4756]: I0318 14:05:00.623069 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8"] Mar 18 14:05:00 crc kubenswrapper[4756]: I0318 14:05:00.623420 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" podUID="2f81be97-aa68-40db-b435-59dc4773998d" containerName="controller-manager" containerID="cri-o://4de387cdeb92616cb90f321e40421fa1f0381d7b5076fd57856d705b32035614" gracePeriod=30 Mar 18 14:05:00 crc kubenswrapper[4756]: I0318 14:05:00.694738 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6"] Mar 18 14:05:00 crc kubenswrapper[4756]: I0318 14:05:00.694992 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" podUID="794af0e9-709b-4cbd-9d8e-a5b4921a8363" containerName="route-controller-manager" containerID="cri-o://214782fec7483d6da42a8379027bb4f4b9c0d5618a319d560c0e5367a5195681" gracePeriod=30 Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.144505 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.152953 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.188857 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f81be97-aa68-40db-b435-59dc4773998d-serving-cert\") pod \"2f81be97-aa68-40db-b435-59dc4773998d\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.188945 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-proxy-ca-bundles\") pod \"2f81be97-aa68-40db-b435-59dc4773998d\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.189001 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/794af0e9-709b-4cbd-9d8e-a5b4921a8363-serving-cert\") pod \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " Mar 18 
14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.189079 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j54z\" (UniqueName: \"kubernetes.io/projected/794af0e9-709b-4cbd-9d8e-a5b4921a8363-kube-api-access-2j54z\") pod \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.189159 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-client-ca\") pod \"2f81be97-aa68-40db-b435-59dc4773998d\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.189253 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/794af0e9-709b-4cbd-9d8e-a5b4921a8363-client-ca\") pod \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.189304 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7zjx\" (UniqueName: \"kubernetes.io/projected/2f81be97-aa68-40db-b435-59dc4773998d-kube-api-access-q7zjx\") pod \"2f81be97-aa68-40db-b435-59dc4773998d\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.189447 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/794af0e9-709b-4cbd-9d8e-a5b4921a8363-config\") pod \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\" (UID: \"794af0e9-709b-4cbd-9d8e-a5b4921a8363\") " Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.189498 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-config\") 
pod \"2f81be97-aa68-40db-b435-59dc4773998d\" (UID: \"2f81be97-aa68-40db-b435-59dc4773998d\") " Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.189954 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2f81be97-aa68-40db-b435-59dc4773998d" (UID: "2f81be97-aa68-40db-b435-59dc4773998d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.190596 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/794af0e9-709b-4cbd-9d8e-a5b4921a8363-client-ca" (OuterVolumeSpecName: "client-ca") pod "794af0e9-709b-4cbd-9d8e-a5b4921a8363" (UID: "794af0e9-709b-4cbd-9d8e-a5b4921a8363"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.194025 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-client-ca" (OuterVolumeSpecName: "client-ca") pod "2f81be97-aa68-40db-b435-59dc4773998d" (UID: "2f81be97-aa68-40db-b435-59dc4773998d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.194720 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/794af0e9-709b-4cbd-9d8e-a5b4921a8363-config" (OuterVolumeSpecName: "config") pod "794af0e9-709b-4cbd-9d8e-a5b4921a8363" (UID: "794af0e9-709b-4cbd-9d8e-a5b4921a8363"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.194885 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/794af0e9-709b-4cbd-9d8e-a5b4921a8363-kube-api-access-2j54z" (OuterVolumeSpecName: "kube-api-access-2j54z") pod "794af0e9-709b-4cbd-9d8e-a5b4921a8363" (UID: "794af0e9-709b-4cbd-9d8e-a5b4921a8363"). InnerVolumeSpecName "kube-api-access-2j54z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.195023 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f81be97-aa68-40db-b435-59dc4773998d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2f81be97-aa68-40db-b435-59dc4773998d" (UID: "2f81be97-aa68-40db-b435-59dc4773998d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.195677 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-config" (OuterVolumeSpecName: "config") pod "2f81be97-aa68-40db-b435-59dc4773998d" (UID: "2f81be97-aa68-40db-b435-59dc4773998d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.200366 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794af0e9-709b-4cbd-9d8e-a5b4921a8363-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "794af0e9-709b-4cbd-9d8e-a5b4921a8363" (UID: "794af0e9-709b-4cbd-9d8e-a5b4921a8363"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.200455 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f81be97-aa68-40db-b435-59dc4773998d-kube-api-access-q7zjx" (OuterVolumeSpecName: "kube-api-access-q7zjx") pod "2f81be97-aa68-40db-b435-59dc4773998d" (UID: "2f81be97-aa68-40db-b435-59dc4773998d"). InnerVolumeSpecName "kube-api-access-q7zjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.256712 4756 generic.go:334] "Generic (PLEG): container finished" podID="2f81be97-aa68-40db-b435-59dc4773998d" containerID="4de387cdeb92616cb90f321e40421fa1f0381d7b5076fd57856d705b32035614" exitCode=0 Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.256782 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.256804 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" event={"ID":"2f81be97-aa68-40db-b435-59dc4773998d","Type":"ContainerDied","Data":"4de387cdeb92616cb90f321e40421fa1f0381d7b5076fd57856d705b32035614"} Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.256845 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8" event={"ID":"2f81be97-aa68-40db-b435-59dc4773998d","Type":"ContainerDied","Data":"b75a3a5839ad159a6da30752e0cd2d4c9c0b61f5c1b82b890bf6c806d6635ceb"} Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.256870 4756 scope.go:117] "RemoveContainer" containerID="4de387cdeb92616cb90f321e40421fa1f0381d7b5076fd57856d705b32035614" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.260050 4756 generic.go:334] "Generic (PLEG): container finished" 
podID="794af0e9-709b-4cbd-9d8e-a5b4921a8363" containerID="214782fec7483d6da42a8379027bb4f4b9c0d5618a319d560c0e5367a5195681" exitCode=0 Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.260089 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" event={"ID":"794af0e9-709b-4cbd-9d8e-a5b4921a8363","Type":"ContainerDied","Data":"214782fec7483d6da42a8379027bb4f4b9c0d5618a319d560c0e5367a5195681"} Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.260143 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" event={"ID":"794af0e9-709b-4cbd-9d8e-a5b4921a8363","Type":"ContainerDied","Data":"0af51f0654295299bf82bbf5e7e824ce04b4b939f68c9893ea0cfe303de1b3ea"} Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.260218 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.292898 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/794af0e9-709b-4cbd-9d8e-a5b4921a8363-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.292932 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.292944 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f81be97-aa68-40db-b435-59dc4773998d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.292957 4756 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.292968 4756 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/794af0e9-709b-4cbd-9d8e-a5b4921a8363-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.292979 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j54z\" (UniqueName: \"kubernetes.io/projected/794af0e9-709b-4cbd-9d8e-a5b4921a8363-kube-api-access-2j54z\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.292990 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f81be97-aa68-40db-b435-59dc4773998d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.293000 4756 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/794af0e9-709b-4cbd-9d8e-a5b4921a8363-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.293011 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7zjx\" (UniqueName: \"kubernetes.io/projected/2f81be97-aa68-40db-b435-59dc4773998d-kube-api-access-q7zjx\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.295156 4756 scope.go:117] "RemoveContainer" containerID="4de387cdeb92616cb90f321e40421fa1f0381d7b5076fd57856d705b32035614" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.295991 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8"] Mar 18 14:05:01 crc kubenswrapper[4756]: E0318 14:05:01.297552 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"4de387cdeb92616cb90f321e40421fa1f0381d7b5076fd57856d705b32035614\": container with ID starting with 4de387cdeb92616cb90f321e40421fa1f0381d7b5076fd57856d705b32035614 not found: ID does not exist" containerID="4de387cdeb92616cb90f321e40421fa1f0381d7b5076fd57856d705b32035614" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.297624 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4de387cdeb92616cb90f321e40421fa1f0381d7b5076fd57856d705b32035614"} err="failed to get container status \"4de387cdeb92616cb90f321e40421fa1f0381d7b5076fd57856d705b32035614\": rpc error: code = NotFound desc = could not find container \"4de387cdeb92616cb90f321e40421fa1f0381d7b5076fd57856d705b32035614\": container with ID starting with 4de387cdeb92616cb90f321e40421fa1f0381d7b5076fd57856d705b32035614 not found: ID does not exist" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.297862 4756 scope.go:117] "RemoveContainer" containerID="214782fec7483d6da42a8379027bb4f4b9c0d5618a319d560c0e5367a5195681" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.299901 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d5f9ff9d6-nt9q8"] Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.305033 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6"] Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.308440 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595d8b7464-xlmw6"] Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.313852 4756 scope.go:117] "RemoveContainer" containerID="214782fec7483d6da42a8379027bb4f4b9c0d5618a319d560c0e5367a5195681" Mar 18 14:05:01 crc kubenswrapper[4756]: E0318 14:05:01.314251 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"214782fec7483d6da42a8379027bb4f4b9c0d5618a319d560c0e5367a5195681\": container with ID starting with 214782fec7483d6da42a8379027bb4f4b9c0d5618a319d560c0e5367a5195681 not found: ID does not exist" containerID="214782fec7483d6da42a8379027bb4f4b9c0d5618a319d560c0e5367a5195681" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.314295 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"214782fec7483d6da42a8379027bb4f4b9c0d5618a319d560c0e5367a5195681"} err="failed to get container status \"214782fec7483d6da42a8379027bb4f4b9c0d5618a319d560c0e5367a5195681\": rpc error: code = NotFound desc = could not find container \"214782fec7483d6da42a8379027bb4f4b9c0d5618a319d560c0e5367a5195681\": container with ID starting with 214782fec7483d6da42a8379027bb4f4b9c0d5618a319d560c0e5367a5195681 not found: ID does not exist" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.320739 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f81be97-aa68-40db-b435-59dc4773998d" path="/var/lib/kubelet/pods/2f81be97-aa68-40db-b435-59dc4773998d/volumes" Mar 18 14:05:01 crc kubenswrapper[4756]: I0318 14:05:01.321330 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="794af0e9-709b-4cbd-9d8e-a5b4921a8363" path="/var/lib/kubelet/pods/794af0e9-709b-4cbd-9d8e-a5b4921a8363/volumes" Mar 18 14:05:01 crc kubenswrapper[4756]: E0318 14:05:01.329478 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod794af0e9_709b_4cbd_9d8e_a5b4921a8363.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f81be97_aa68_40db_b435_59dc4773998d.slice\": RecentStats: unable to find data in memory cache]" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.599473 4756 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4"] Mar 18 14:05:02 crc kubenswrapper[4756]: E0318 14:05:02.599822 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794af0e9-709b-4cbd-9d8e-a5b4921a8363" containerName="route-controller-manager" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.599844 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="794af0e9-709b-4cbd-9d8e-a5b4921a8363" containerName="route-controller-manager" Mar 18 14:05:02 crc kubenswrapper[4756]: E0318 14:05:02.599856 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c76033c1-1ccb-42ce-ade9-f46428bc0b46" containerName="extract-utilities" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.599865 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c76033c1-1ccb-42ce-ade9-f46428bc0b46" containerName="extract-utilities" Mar 18 14:05:02 crc kubenswrapper[4756]: E0318 14:05:02.599882 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" containerName="extract-content" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.599895 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" containerName="extract-content" Mar 18 14:05:02 crc kubenswrapper[4756]: E0318 14:05:02.599907 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" containerName="extract-utilities" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.599917 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" containerName="extract-utilities" Mar 18 14:05:02 crc kubenswrapper[4756]: E0318 14:05:02.599935 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c76033c1-1ccb-42ce-ade9-f46428bc0b46" containerName="registry-server" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.599945 4756 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c76033c1-1ccb-42ce-ade9-f46428bc0b46" containerName="registry-server" Mar 18 14:05:02 crc kubenswrapper[4756]: E0318 14:05:02.599956 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f81be97-aa68-40db-b435-59dc4773998d" containerName="controller-manager" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.599964 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f81be97-aa68-40db-b435-59dc4773998d" containerName="controller-manager" Mar 18 14:05:02 crc kubenswrapper[4756]: E0318 14:05:02.599981 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c76033c1-1ccb-42ce-ade9-f46428bc0b46" containerName="extract-content" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.599991 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c76033c1-1ccb-42ce-ade9-f46428bc0b46" containerName="extract-content" Mar 18 14:05:02 crc kubenswrapper[4756]: E0318 14:05:02.600006 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" containerName="registry-server" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.600015 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" containerName="registry-server" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.600169 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="794af0e9-709b-4cbd-9d8e-a5b4921a8363" containerName="route-controller-manager" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.600189 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f81be97-aa68-40db-b435-59dc4773998d" containerName="controller-manager" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.600208 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c76033c1-1ccb-42ce-ade9-f46428bc0b46" containerName="registry-server" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 
14:05:02.600221 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a6ae3a-3fd6-4254-9ca5-8654eee53b55" containerName="registry-server" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.600660 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.603389 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-55dfcc958b-qmz42"] Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.604110 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.607839 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.609700 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37ffe7c5-4b82-48c8-8003-8f0c3f21838d-serving-cert\") pod \"controller-manager-55dfcc958b-qmz42\" (UID: \"37ffe7c5-4b82-48c8-8003-8f0c3f21838d\") " pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.609733 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23799dbf-9776-429c-84de-877666b2adca-client-ca\") pod \"route-controller-manager-5cf6669bf4-24dn4\" (UID: \"23799dbf-9776-429c-84de-877666b2adca\") " pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.609778 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37ffe7c5-4b82-48c8-8003-8f0c3f21838d-config\") pod \"controller-manager-55dfcc958b-qmz42\" (UID: \"37ffe7c5-4b82-48c8-8003-8f0c3f21838d\") " pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.609814 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb7zk\" (UniqueName: \"kubernetes.io/projected/23799dbf-9776-429c-84de-877666b2adca-kube-api-access-nb7zk\") pod \"route-controller-manager-5cf6669bf4-24dn4\" (UID: \"23799dbf-9776-429c-84de-877666b2adca\") " pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.609926 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23799dbf-9776-429c-84de-877666b2adca-serving-cert\") pod \"route-controller-manager-5cf6669bf4-24dn4\" (UID: \"23799dbf-9776-429c-84de-877666b2adca\") " pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.610095 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mh7h\" (UniqueName: \"kubernetes.io/projected/37ffe7c5-4b82-48c8-8003-8f0c3f21838d-kube-api-access-9mh7h\") pod \"controller-manager-55dfcc958b-qmz42\" (UID: \"37ffe7c5-4b82-48c8-8003-8f0c3f21838d\") " pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.610164 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37ffe7c5-4b82-48c8-8003-8f0c3f21838d-proxy-ca-bundles\") pod 
\"controller-manager-55dfcc958b-qmz42\" (UID: \"37ffe7c5-4b82-48c8-8003-8f0c3f21838d\") " pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.610231 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37ffe7c5-4b82-48c8-8003-8f0c3f21838d-client-ca\") pod \"controller-manager-55dfcc958b-qmz42\" (UID: \"37ffe7c5-4b82-48c8-8003-8f0c3f21838d\") " pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.610290 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23799dbf-9776-429c-84de-877666b2adca-config\") pod \"route-controller-manager-5cf6669bf4-24dn4\" (UID: \"23799dbf-9776-429c-84de-877666b2adca\") " pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.611672 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.611862 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.611993 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.612162 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.622595 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 
14:05:02.622764 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.622795 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.623136 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.623382 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.623467 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.623803 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.631442 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4"] Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.641588 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55dfcc958b-qmz42"] Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.652491 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.711642 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mh7h\" (UniqueName: \"kubernetes.io/projected/37ffe7c5-4b82-48c8-8003-8f0c3f21838d-kube-api-access-9mh7h\") pod \"controller-manager-55dfcc958b-qmz42\" (UID: 
\"37ffe7c5-4b82-48c8-8003-8f0c3f21838d\") " pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.711892 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37ffe7c5-4b82-48c8-8003-8f0c3f21838d-proxy-ca-bundles\") pod \"controller-manager-55dfcc958b-qmz42\" (UID: \"37ffe7c5-4b82-48c8-8003-8f0c3f21838d\") " pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.711935 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37ffe7c5-4b82-48c8-8003-8f0c3f21838d-client-ca\") pod \"controller-manager-55dfcc958b-qmz42\" (UID: \"37ffe7c5-4b82-48c8-8003-8f0c3f21838d\") " pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.711961 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23799dbf-9776-429c-84de-877666b2adca-config\") pod \"route-controller-manager-5cf6669bf4-24dn4\" (UID: \"23799dbf-9776-429c-84de-877666b2adca\") " pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.711991 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37ffe7c5-4b82-48c8-8003-8f0c3f21838d-serving-cert\") pod \"controller-manager-55dfcc958b-qmz42\" (UID: \"37ffe7c5-4b82-48c8-8003-8f0c3f21838d\") " pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.712006 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/23799dbf-9776-429c-84de-877666b2adca-client-ca\") pod \"route-controller-manager-5cf6669bf4-24dn4\" (UID: \"23799dbf-9776-429c-84de-877666b2adca\") " pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.712028 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37ffe7c5-4b82-48c8-8003-8f0c3f21838d-config\") pod \"controller-manager-55dfcc958b-qmz42\" (UID: \"37ffe7c5-4b82-48c8-8003-8f0c3f21838d\") " pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.712055 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb7zk\" (UniqueName: \"kubernetes.io/projected/23799dbf-9776-429c-84de-877666b2adca-kube-api-access-nb7zk\") pod \"route-controller-manager-5cf6669bf4-24dn4\" (UID: \"23799dbf-9776-429c-84de-877666b2adca\") " pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.712074 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23799dbf-9776-429c-84de-877666b2adca-serving-cert\") pod \"route-controller-manager-5cf6669bf4-24dn4\" (UID: \"23799dbf-9776-429c-84de-877666b2adca\") " pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.712961 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37ffe7c5-4b82-48c8-8003-8f0c3f21838d-proxy-ca-bundles\") pod \"controller-manager-55dfcc958b-qmz42\" (UID: \"37ffe7c5-4b82-48c8-8003-8f0c3f21838d\") " pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc 
kubenswrapper[4756]: I0318 14:05:02.713513 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37ffe7c5-4b82-48c8-8003-8f0c3f21838d-client-ca\") pod \"controller-manager-55dfcc958b-qmz42\" (UID: \"37ffe7c5-4b82-48c8-8003-8f0c3f21838d\") " pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.714361 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37ffe7c5-4b82-48c8-8003-8f0c3f21838d-config\") pod \"controller-manager-55dfcc958b-qmz42\" (UID: \"37ffe7c5-4b82-48c8-8003-8f0c3f21838d\") " pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.714919 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23799dbf-9776-429c-84de-877666b2adca-client-ca\") pod \"route-controller-manager-5cf6669bf4-24dn4\" (UID: \"23799dbf-9776-429c-84de-877666b2adca\") " pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.715807 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23799dbf-9776-429c-84de-877666b2adca-config\") pod \"route-controller-manager-5cf6669bf4-24dn4\" (UID: \"23799dbf-9776-429c-84de-877666b2adca\") " pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.716340 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37ffe7c5-4b82-48c8-8003-8f0c3f21838d-serving-cert\") pod \"controller-manager-55dfcc958b-qmz42\" (UID: \"37ffe7c5-4b82-48c8-8003-8f0c3f21838d\") " 
pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.731231 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23799dbf-9776-429c-84de-877666b2adca-serving-cert\") pod \"route-controller-manager-5cf6669bf4-24dn4\" (UID: \"23799dbf-9776-429c-84de-877666b2adca\") " pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.735298 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb7zk\" (UniqueName: \"kubernetes.io/projected/23799dbf-9776-429c-84de-877666b2adca-kube-api-access-nb7zk\") pod \"route-controller-manager-5cf6669bf4-24dn4\" (UID: \"23799dbf-9776-429c-84de-877666b2adca\") " pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.737377 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mh7h\" (UniqueName: \"kubernetes.io/projected/37ffe7c5-4b82-48c8-8003-8f0c3f21838d-kube-api-access-9mh7h\") pod \"controller-manager-55dfcc958b-qmz42\" (UID: \"37ffe7c5-4b82-48c8-8003-8f0c3f21838d\") " pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.929005 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:02 crc kubenswrapper[4756]: I0318 14:05:02.940555 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.324754 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4"] Mar 18 14:05:03 crc kubenswrapper[4756]: W0318 14:05:03.328637 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23799dbf_9776_429c_84de_877666b2adca.slice/crio-f52249602eb752879db913b1f2f6c056e0e5e407ce84f30ba623febfddf94ef9 WatchSource:0}: Error finding container f52249602eb752879db913b1f2f6c056e0e5e407ce84f30ba623febfddf94ef9: Status 404 returned error can't find the container with id f52249602eb752879db913b1f2f6c056e0e5e407ce84f30ba623febfddf94ef9 Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.365333 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55dfcc958b-qmz42"] Mar 18 14:05:03 crc kubenswrapper[4756]: W0318 14:05:03.371354 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37ffe7c5_4b82_48c8_8003_8f0c3f21838d.slice/crio-0faa86f30f747ca3646093b68a0518f704947172f0455d7402bddddacb6cc604 WatchSource:0}: Error finding container 0faa86f30f747ca3646093b68a0518f704947172f0455d7402bddddacb6cc604: Status 404 returned error can't find the container with id 0faa86f30f747ca3646093b68a0518f704947172f0455d7402bddddacb6cc604 Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.817820 4756 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.818630 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.823441 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.823649 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.823727 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.823812 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.823906 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.846859 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.892720 4756 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.893218 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80" gracePeriod=15 Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.893452 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8" gracePeriod=15 Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.893583 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b" gracePeriod=15 Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.893710 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" 
containerID="cri-o://1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34" gracePeriod=15 Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.893804 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1" gracePeriod=15 Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895330 4756 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 14:05:03 crc kubenswrapper[4756]: E0318 14:05:03.895582 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895596 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 14:05:03 crc kubenswrapper[4756]: E0318 14:05:03.895606 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895613 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 14:05:03 crc kubenswrapper[4756]: E0318 14:05:03.895621 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895628 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 14:05:03 crc kubenswrapper[4756]: E0318 14:05:03.895637 4756 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895644 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 14:05:03 crc kubenswrapper[4756]: E0318 14:05:03.895656 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895663 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 14:05:03 crc kubenswrapper[4756]: E0318 14:05:03.895679 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895686 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 14:05:03 crc kubenswrapper[4756]: E0318 14:05:03.895694 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895701 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 14:05:03 crc kubenswrapper[4756]: E0318 14:05:03.895713 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895720 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 14:05:03 
crc kubenswrapper[4756]: E0318 14:05:03.895732 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895739 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895851 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895897 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895909 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895918 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895926 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895935 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895944 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.895953 4756 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 14:05:03 crc kubenswrapper[4756]: E0318 14:05:03.896068 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.896078 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.896199 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.925656 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.925978 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.926012 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.926034 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.926048 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.926071 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.926097 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.926176 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.926263 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.926295 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.926316 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.926337 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:03 crc kubenswrapper[4756]: I0318 14:05:03.926356 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.027023 4756 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.027107 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.027157 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.027274 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.027335 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.027373 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.144839 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 14:05:04 crc kubenswrapper[4756]: W0318 14:05:04.161017 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-0997092dbfa87b4f292d07d2d82cbc057b532e8cdcb25427169b16618ebf2db2 WatchSource:0}: Error finding container 0997092dbfa87b4f292d07d2d82cbc057b532e8cdcb25427169b16618ebf2db2: Status 404 returned error can't find the container with id 0997092dbfa87b4f292d07d2d82cbc057b532e8cdcb25427169b16618ebf2db2 Mar 18 14:05:04 crc kubenswrapper[4756]: E0318 14:05:04.163677 4756 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.34:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189df48a8d0bfbae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:05:04.16288043 +0000 UTC m=+305.477298405,LastTimestamp:2026-03-18 14:05:04.16288043 +0000 UTC m=+305.477298405,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.281720 4756 generic.go:334] "Generic (PLEG): container finished" podID="356d2f47-a922-458e-8578-a79ee650e100" containerID="5358e7736ff09b1c3b89f9788bef458ec6a748c78830cdd8ec71e6314feda929" exitCode=0 Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.281844 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"356d2f47-a922-458e-8578-a79ee650e100","Type":"ContainerDied","Data":"5358e7736ff09b1c3b89f9788bef458ec6a748c78830cdd8ec71e6314feda929"} Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.282623 4756 status_manager.go:851] "Failed to get status for pod" podUID="356d2f47-a922-458e-8578-a79ee650e100" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.282960 4756 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.283478 4756 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.283782 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" event={"ID":"23799dbf-9776-429c-84de-877666b2adca","Type":"ContainerStarted","Data":"6f11781b1f8fd2709a40f318ff39e7ebf42b6d79611fc7224ece1db9c9f9fd7c"} Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.283812 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" event={"ID":"23799dbf-9776-429c-84de-877666b2adca","Type":"ContainerStarted","Data":"f52249602eb752879db913b1f2f6c056e0e5e407ce84f30ba623febfddf94ef9"} Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.284143 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.286053 4756 status_manager.go:851] "Failed to get status for pod" podUID="356d2f47-a922-458e-8578-a79ee650e100" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.286527 4756 status_manager.go:851] "Failed to get status for pod" podUID="23799dbf-9776-429c-84de-877666b2adca" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5cf6669bf4-24dn4\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.286899 4756 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.34:6443: 
connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.287220 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.287416 4756 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.288567 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.289216 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8" exitCode=0 Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.289244 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b" exitCode=0 Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.289256 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34" exitCode=0 Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.289269 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1" exitCode=2 Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.289329 
4756 scope.go:117] "RemoveContainer" containerID="c469f5f646cab15958c71257ca9710364a0cb587aa832a6a4c3d6c5169fc097a" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.289838 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.290338 4756 status_manager.go:851] "Failed to get status for pod" podUID="356d2f47-a922-458e-8578-a79ee650e100" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.290408 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0997092dbfa87b4f292d07d2d82cbc057b532e8cdcb25427169b16618ebf2db2"} Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.290730 4756 status_manager.go:851] "Failed to get status for pod" podUID="23799dbf-9776-429c-84de-877666b2adca" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5cf6669bf4-24dn4\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.291098 4756 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.291527 4756 status_manager.go:851] "Failed to get 
status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.291980 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" event={"ID":"37ffe7c5-4b82-48c8-8003-8f0c3f21838d","Type":"ContainerStarted","Data":"dea0595af07487c1015e51f0a05ed6446d76801c0d915a83e0ab8e77b46ae26a"} Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.292027 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" event={"ID":"37ffe7c5-4b82-48c8-8003-8f0c3f21838d","Type":"ContainerStarted","Data":"0faa86f30f747ca3646093b68a0518f704947172f0455d7402bddddacb6cc604"} Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.292234 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.292622 4756 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.292954 4756 status_manager.go:851] "Failed to get status for pod" podUID="37ffe7c5-4b82-48c8-8003-8f0c3f21838d" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-55dfcc958b-qmz42\": dial tcp 
38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.293257 4756 status_manager.go:851] "Failed to get status for pod" podUID="356d2f47-a922-458e-8578-a79ee650e100" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.293503 4756 status_manager.go:851] "Failed to get status for pod" podUID="23799dbf-9776-429c-84de-877666b2adca" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5cf6669bf4-24dn4\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.293745 4756 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.296322 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.296654 4756 status_manager.go:851] "Failed to get status for pod" podUID="23799dbf-9776-429c-84de-877666b2adca" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5cf6669bf4-24dn4\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.296945 
4756 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.297191 4756 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.297427 4756 status_manager.go:851] "Failed to get status for pod" podUID="37ffe7c5-4b82-48c8-8003-8f0c3f21838d" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-55dfcc958b-qmz42\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:04 crc kubenswrapper[4756]: I0318 14:05:04.297679 4756 status_manager.go:851] "Failed to get status for pod" podUID="356d2f47-a922-458e-8578-a79ee650e100" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.304262 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.307260 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c3871e675d7e5c9c7f87dcc384977ce8e00de046736fd7c02ad7c112bb2c61ba"} Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.308868 4756 status_manager.go:851] "Failed to get status for pod" podUID="23799dbf-9776-429c-84de-877666b2adca" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5cf6669bf4-24dn4\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.309287 4756 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.309652 4756 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.309966 4756 status_manager.go:851] "Failed to get status for pod" podUID="37ffe7c5-4b82-48c8-8003-8f0c3f21838d" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-55dfcc958b-qmz42\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.310321 4756 status_manager.go:851] "Failed to get status for pod" 
podUID="356d2f47-a922-458e-8578-a79ee650e100" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.577867 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.578817 4756 status_manager.go:851] "Failed to get status for pod" podUID="37ffe7c5-4b82-48c8-8003-8f0c3f21838d" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-55dfcc958b-qmz42\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.579294 4756 status_manager.go:851] "Failed to get status for pod" podUID="356d2f47-a922-458e-8578-a79ee650e100" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.579643 4756 status_manager.go:851] "Failed to get status for pod" podUID="23799dbf-9776-429c-84de-877666b2adca" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5cf6669bf4-24dn4\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.579899 4756 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.749776 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/356d2f47-a922-458e-8578-a79ee650e100-kubelet-dir\") pod \"356d2f47-a922-458e-8578-a79ee650e100\" (UID: \"356d2f47-a922-458e-8578-a79ee650e100\") " Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.749933 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/356d2f47-a922-458e-8578-a79ee650e100-var-lock\") pod \"356d2f47-a922-458e-8578-a79ee650e100\" (UID: \"356d2f47-a922-458e-8578-a79ee650e100\") " Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.750014 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/356d2f47-a922-458e-8578-a79ee650e100-kube-api-access\") pod \"356d2f47-a922-458e-8578-a79ee650e100\" (UID: \"356d2f47-a922-458e-8578-a79ee650e100\") " Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.750063 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/356d2f47-a922-458e-8578-a79ee650e100-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "356d2f47-a922-458e-8578-a79ee650e100" (UID: "356d2f47-a922-458e-8578-a79ee650e100"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.750067 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/356d2f47-a922-458e-8578-a79ee650e100-var-lock" (OuterVolumeSpecName: "var-lock") pod "356d2f47-a922-458e-8578-a79ee650e100" (UID: "356d2f47-a922-458e-8578-a79ee650e100"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.750559 4756 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/356d2f47-a922-458e-8578-a79ee650e100-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.750597 4756 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/356d2f47-a922-458e-8578-a79ee650e100-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.758294 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/356d2f47-a922-458e-8578-a79ee650e100-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "356d2f47-a922-458e-8578-a79ee650e100" (UID: "356d2f47-a922-458e-8578-a79ee650e100"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:05:05 crc kubenswrapper[4756]: I0318 14:05:05.852089 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/356d2f47-a922-458e-8578-a79ee650e100-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.276451 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.277726 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.278330 4756 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.278780 4756 status_manager.go:851] "Failed to get status for pod" podUID="37ffe7c5-4b82-48c8-8003-8f0c3f21838d" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-55dfcc958b-qmz42\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.279155 4756 status_manager.go:851] "Failed to get status for pod" podUID="356d2f47-a922-458e-8578-a79ee650e100" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.279692 4756 status_manager.go:851] "Failed to get status for pod" podUID="23799dbf-9776-429c-84de-877666b2adca" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5cf6669bf4-24dn4\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.280013 4756 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.316888 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.317884 4756 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80" exitCode=0 Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.317965 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.317968 4756 scope.go:117] "RemoveContainer" containerID="9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.319935 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.320259 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"356d2f47-a922-458e-8578-a79ee650e100","Type":"ContainerDied","Data":"3d2ceb5e165da57bfb6d3d52028651bd70ffe7e7ba51a707dbd268a4911957df"} Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.320288 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d2ceb5e165da57bfb6d3d52028651bd70ffe7e7ba51a707dbd268a4911957df" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.334671 4756 status_manager.go:851] "Failed to get status for pod" podUID="356d2f47-a922-458e-8578-a79ee650e100" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.337997 4756 scope.go:117] "RemoveContainer" containerID="3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.339303 4756 status_manager.go:851] "Failed to get status for pod" podUID="23799dbf-9776-429c-84de-877666b2adca" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5cf6669bf4-24dn4\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.339911 4756 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.34:6443: connect: connection 
refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.340398 4756 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.340753 4756 status_manager.go:851] "Failed to get status for pod" podUID="37ffe7c5-4b82-48c8-8003-8f0c3f21838d" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-55dfcc958b-qmz42\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.350881 4756 scope.go:117] "RemoveContainer" containerID="1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.359847 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.360024 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.360077 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.361519 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.361589 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.361790 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.362744 4756 scope.go:117] "RemoveContainer" containerID="9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.376454 4756 scope.go:117] "RemoveContainer" containerID="43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.397128 4756 scope.go:117] "RemoveContainer" containerID="bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.418058 4756 scope.go:117] "RemoveContainer" containerID="9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8" Mar 18 14:05:06 crc kubenswrapper[4756]: E0318 14:05:06.418416 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\": container with ID starting with 9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8 not found: ID does not exist" containerID="9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.418459 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8"} err="failed to get container status \"9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\": rpc error: code = NotFound desc = could not find container \"9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8\": container with ID starting with 9b3d6cdaa002eb58ff3cb0791df5c9d58a4210e7269de9b82164241425ba7dd8 not found: ID does not exist" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.418488 4756 scope.go:117] "RemoveContainer" 
containerID="3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b" Mar 18 14:05:06 crc kubenswrapper[4756]: E0318 14:05:06.419196 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\": container with ID starting with 3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b not found: ID does not exist" containerID="3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.419252 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b"} err="failed to get container status \"3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\": rpc error: code = NotFound desc = could not find container \"3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b\": container with ID starting with 3525e6d0c7cffc5f507daec2e8c3c9156f5a5410eb32ebcba1fda737057bdc8b not found: ID does not exist" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.419271 4756 scope.go:117] "RemoveContainer" containerID="1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34" Mar 18 14:05:06 crc kubenswrapper[4756]: E0318 14:05:06.419584 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\": container with ID starting with 1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34 not found: ID does not exist" containerID="1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.419623 4756 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34"} err="failed to get container status \"1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\": rpc error: code = NotFound desc = could not find container \"1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34\": container with ID starting with 1c89745a8bb61730bfd6266cca938601e3000e494a195740bf1fa3d761207a34 not found: ID does not exist" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.419638 4756 scope.go:117] "RemoveContainer" containerID="9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1" Mar 18 14:05:06 crc kubenswrapper[4756]: E0318 14:05:06.419980 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\": container with ID starting with 9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1 not found: ID does not exist" containerID="9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.420017 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1"} err="failed to get container status \"9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\": rpc error: code = NotFound desc = could not find container \"9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1\": container with ID starting with 9eb71bc3159599af3a3509ad84fef4cc80abd616a2c734e379b4bc82579e0bb1 not found: ID does not exist" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.420034 4756 scope.go:117] "RemoveContainer" containerID="43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80" Mar 18 14:05:06 crc kubenswrapper[4756]: E0318 14:05:06.421371 4756 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\": container with ID starting with 43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80 not found: ID does not exist" containerID="43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.421397 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80"} err="failed to get container status \"43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\": rpc error: code = NotFound desc = could not find container \"43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80\": container with ID starting with 43dbcfb5c7910eae1e195758374300740906e44e3a0288b536e2426f108d7f80 not found: ID does not exist" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.421411 4756 scope.go:117] "RemoveContainer" containerID="bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2" Mar 18 14:05:06 crc kubenswrapper[4756]: E0318 14:05:06.421688 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\": container with ID starting with bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2 not found: ID does not exist" containerID="bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.421717 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2"} err="failed to get container status \"bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\": rpc error: code = NotFound desc = could not find container 
\"bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2\": container with ID starting with bbac22a7cf0268941f0a01cbd3179e5f52acc6fb64b77dd56913826ffa1242f2 not found: ID does not exist" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.462379 4756 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.462418 4756 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.462432 4756 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.638397 4756 status_manager.go:851] "Failed to get status for pod" podUID="23799dbf-9776-429c-84de-877666b2adca" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5cf6669bf4-24dn4\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.638937 4756 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.639204 4756 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.639436 4756 status_manager.go:851] "Failed to get status for pod" podUID="37ffe7c5-4b82-48c8-8003-8f0c3f21838d" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-55dfcc958b-qmz42\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.639713 4756 status_manager.go:851] "Failed to get status for pod" podUID="356d2f47-a922-458e-8578-a79ee650e100" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.915019 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.915083 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.915208 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.915995 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670"} pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:05:06 crc kubenswrapper[4756]: I0318 14:05:06.916090 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" containerID="cri-o://bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670" gracePeriod=600 Mar 18 14:05:07 crc kubenswrapper[4756]: I0318 14:05:07.322499 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 18 14:05:07 crc kubenswrapper[4756]: I0318 14:05:07.330196 4756 generic.go:334] "Generic (PLEG): container finished" podID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerID="bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670" exitCode=0 Mar 18 14:05:07 crc kubenswrapper[4756]: I0318 14:05:07.330244 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerDied","Data":"bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670"} Mar 18 14:05:07 crc kubenswrapper[4756]: I0318 14:05:07.330315 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" 
event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"7986574aedd1bbfebd839420217d09df3d36a81aa68ea117b5469df20091c844"} Mar 18 14:05:07 crc kubenswrapper[4756]: I0318 14:05:07.330932 4756 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:07 crc kubenswrapper[4756]: I0318 14:05:07.331210 4756 status_manager.go:851] "Failed to get status for pod" podUID="37ffe7c5-4b82-48c8-8003-8f0c3f21838d" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-55dfcc958b-qmz42\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:07 crc kubenswrapper[4756]: I0318 14:05:07.331456 4756 status_manager.go:851] "Failed to get status for pod" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-qvpkg\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:07 crc kubenswrapper[4756]: I0318 14:05:07.331760 4756 status_manager.go:851] "Failed to get status for pod" podUID="356d2f47-a922-458e-8578-a79ee650e100" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:07 crc kubenswrapper[4756]: I0318 14:05:07.331989 4756 status_manager.go:851] "Failed to get status for pod" podUID="23799dbf-9776-429c-84de-877666b2adca" 
pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5cf6669bf4-24dn4\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:09 crc kubenswrapper[4756]: I0318 14:05:09.329522 4756 status_manager.go:851] "Failed to get status for pod" podUID="23799dbf-9776-429c-84de-877666b2adca" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5cf6669bf4-24dn4\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:09 crc kubenswrapper[4756]: I0318 14:05:09.330360 4756 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:09 crc kubenswrapper[4756]: I0318 14:05:09.330620 4756 status_manager.go:851] "Failed to get status for pod" podUID="37ffe7c5-4b82-48c8-8003-8f0c3f21838d" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-55dfcc958b-qmz42\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:09 crc kubenswrapper[4756]: I0318 14:05:09.330861 4756 status_manager.go:851] "Failed to get status for pod" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-qvpkg\": dial tcp 38.129.56.34:6443: connect: 
connection refused" Mar 18 14:05:09 crc kubenswrapper[4756]: I0318 14:05:09.331080 4756 status_manager.go:851] "Failed to get status for pod" podUID="356d2f47-a922-458e-8578-a79ee650e100" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:13 crc kubenswrapper[4756]: E0318 14:05:13.160692 4756 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.34:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189df48a8d0bfbae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 14:05:04.16288043 +0000 UTC m=+305.477298405,LastTimestamp:2026-03-18 14:05:04.16288043 +0000 UTC m=+305.477298405,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 14:05:14 crc kubenswrapper[4756]: E0318 14:05:14.200899 4756 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:14 crc kubenswrapper[4756]: E0318 14:05:14.201258 4756 controller.go:195] "Failed 
to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:14 crc kubenswrapper[4756]: E0318 14:05:14.201543 4756 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:14 crc kubenswrapper[4756]: E0318 14:05:14.201769 4756 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:14 crc kubenswrapper[4756]: E0318 14:05:14.202102 4756 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:14 crc kubenswrapper[4756]: I0318 14:05:14.202189 4756 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 18 14:05:14 crc kubenswrapper[4756]: E0318 14:05:14.202458 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.34:6443: connect: connection refused" interval="200ms" Mar 18 14:05:14 crc kubenswrapper[4756]: E0318 14:05:14.403978 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.34:6443: connect: connection refused" interval="400ms" Mar 18 14:05:14 crc kubenswrapper[4756]: E0318 
14:05:14.804845 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.34:6443: connect: connection refused" interval="800ms" Mar 18 14:05:15 crc kubenswrapper[4756]: I0318 14:05:15.315177 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:15 crc kubenswrapper[4756]: I0318 14:05:15.316518 4756 status_manager.go:851] "Failed to get status for pod" podUID="23799dbf-9776-429c-84de-877666b2adca" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5cf6669bf4-24dn4\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:15 crc kubenswrapper[4756]: I0318 14:05:15.317111 4756 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:15 crc kubenswrapper[4756]: I0318 14:05:15.317602 4756 status_manager.go:851] "Failed to get status for pod" podUID="37ffe7c5-4b82-48c8-8003-8f0c3f21838d" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-55dfcc958b-qmz42\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:15 crc kubenswrapper[4756]: I0318 14:05:15.317899 4756 status_manager.go:851] "Failed to get status for pod" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" 
pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-qvpkg\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:15 crc kubenswrapper[4756]: I0318 14:05:15.318221 4756 status_manager.go:851] "Failed to get status for pod" podUID="356d2f47-a922-458e-8578-a79ee650e100" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:15 crc kubenswrapper[4756]: I0318 14:05:15.342784 4756 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9ee2d78-d52a-4766-aa8e-f68a998a4df5" Mar 18 14:05:15 crc kubenswrapper[4756]: I0318 14:05:15.342838 4756 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9ee2d78-d52a-4766-aa8e-f68a998a4df5" Mar 18 14:05:15 crc kubenswrapper[4756]: E0318 14:05:15.343684 4756 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:15 crc kubenswrapper[4756]: I0318 14:05:15.344687 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:15 crc kubenswrapper[4756]: E0318 14:05:15.605900 4756 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.34:6443: connect: connection refused" interval="1.6s" Mar 18 14:05:16 crc kubenswrapper[4756]: I0318 14:05:16.392332 4756 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3ff92694e3a37345a2eae631eb7f580f174fa7d10cb87ae3a93e2f964384c817" exitCode=0 Mar 18 14:05:16 crc kubenswrapper[4756]: I0318 14:05:16.392387 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3ff92694e3a37345a2eae631eb7f580f174fa7d10cb87ae3a93e2f964384c817"} Mar 18 14:05:16 crc kubenswrapper[4756]: I0318 14:05:16.392445 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d9862d210104ae38b06df5d606c259be746b0d1cd943d33c8902c6c7e20ecb55"} Mar 18 14:05:16 crc kubenswrapper[4756]: I0318 14:05:16.392734 4756 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9ee2d78-d52a-4766-aa8e-f68a998a4df5" Mar 18 14:05:16 crc kubenswrapper[4756]: I0318 14:05:16.393020 4756 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9ee2d78-d52a-4766-aa8e-f68a998a4df5" Mar 18 14:05:16 crc kubenswrapper[4756]: I0318 14:05:16.393444 4756 status_manager.go:851] "Failed to get status for pod" podUID="356d2f47-a922-458e-8578-a79ee650e100" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:16 crc kubenswrapper[4756]: E0318 14:05:16.393588 4756 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:16 crc kubenswrapper[4756]: I0318 14:05:16.394072 4756 status_manager.go:851] "Failed to get status for pod" podUID="23799dbf-9776-429c-84de-877666b2adca" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5cf6669bf4-24dn4\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:16 crc kubenswrapper[4756]: I0318 14:05:16.394584 4756 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:16 crc kubenswrapper[4756]: I0318 14:05:16.394929 4756 status_manager.go:851] "Failed to get status for pod" podUID="37ffe7c5-4b82-48c8-8003-8f0c3f21838d" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-55dfcc958b-qmz42\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:16 crc kubenswrapper[4756]: I0318 14:05:16.395431 4756 status_manager.go:851] "Failed to get status for pod" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" 
pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-qvpkg\": dial tcp 38.129.56.34:6443: connect: connection refused" Mar 18 14:05:17 crc kubenswrapper[4756]: I0318 14:05:17.406627 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d0913b7a71173b8db8d58401a53ca5e438fbe14f5bec9ff504561a3a7664e72a"} Mar 18 14:05:17 crc kubenswrapper[4756]: I0318 14:05:17.407660 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0c1de1777fb36bf33372f517e278ec627c87f20e9ba084fd779b3ebb3eaeba55"} Mar 18 14:05:17 crc kubenswrapper[4756]: I0318 14:05:17.407742 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"370555a959f0eef4d1a53b21e831bd779a6fd4c6885a7b82401aa7751b9c28ad"} Mar 18 14:05:17 crc kubenswrapper[4756]: I0318 14:05:17.407807 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8760a71a3bbfd66fcd4e502817687b192fec4b00e6a0d239fd1ccd84b38305ce"} Mar 18 14:05:18 crc kubenswrapper[4756]: I0318 14:05:18.418608 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 14:05:18 crc kubenswrapper[4756]: I0318 14:05:18.419431 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 14:05:18 crc kubenswrapper[4756]: I0318 14:05:18.419469 4756 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c" exitCode=1 Mar 18 14:05:18 crc kubenswrapper[4756]: I0318 14:05:18.419520 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c"} Mar 18 14:05:18 crc kubenswrapper[4756]: I0318 14:05:18.419849 4756 scope.go:117] "RemoveContainer" containerID="cef7861ecc049c88bccbecfcaaa1bd0aa041a5d328f69f7694d69766740cfb0c" Mar 18 14:05:18 crc kubenswrapper[4756]: I0318 14:05:18.424194 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1922a14e8850a1248303629154b314341a15a036f126114262525fa4e01203d4"} Mar 18 14:05:18 crc kubenswrapper[4756]: I0318 14:05:18.425380 4756 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9ee2d78-d52a-4766-aa8e-f68a998a4df5" Mar 18 14:05:18 crc kubenswrapper[4756]: I0318 14:05:18.425621 4756 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9ee2d78-d52a-4766-aa8e-f68a998a4df5" Mar 18 14:05:18 crc kubenswrapper[4756]: I0318 14:05:18.979501 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" podUID="d3f5ac66-56e5-4477-ac1b-1ef496242243" containerName="oauth-openshift" 
containerID="cri-o://55017121b61bac574aa0603c66a1c0c5e0720b60d3fa0d1712ea5970acfd5e14" gracePeriod=15 Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.427946 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.433379 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.433871 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.433948 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e82670e3d472b70f573277034da7116f4e667ea3a5c0c60e304732818f871eee"} Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.435653 4756 generic.go:334] "Generic (PLEG): container finished" podID="d3f5ac66-56e5-4477-ac1b-1ef496242243" containerID="55017121b61bac574aa0603c66a1c0c5e0720b60d3fa0d1712ea5970acfd5e14" exitCode=0 Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.435692 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" event={"ID":"d3f5ac66-56e5-4477-ac1b-1ef496242243","Type":"ContainerDied","Data":"55017121b61bac574aa0603c66a1c0c5e0720b60d3fa0d1712ea5970acfd5e14"} Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.435712 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" 
event={"ID":"d3f5ac66-56e5-4477-ac1b-1ef496242243","Type":"ContainerDied","Data":"685be2bf704064d590cd43d659e3dde4deb31eaa99fe1903e29d70a8c32d1b0c"} Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.435732 4756 scope.go:117] "RemoveContainer" containerID="55017121b61bac574aa0603c66a1c0c5e0720b60d3fa0d1712ea5970acfd5e14" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.435771 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9sqqb" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.458186 4756 scope.go:117] "RemoveContainer" containerID="55017121b61bac574aa0603c66a1c0c5e0720b60d3fa0d1712ea5970acfd5e14" Mar 18 14:05:19 crc kubenswrapper[4756]: E0318 14:05:19.458761 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55017121b61bac574aa0603c66a1c0c5e0720b60d3fa0d1712ea5970acfd5e14\": container with ID starting with 55017121b61bac574aa0603c66a1c0c5e0720b60d3fa0d1712ea5970acfd5e14 not found: ID does not exist" containerID="55017121b61bac574aa0603c66a1c0c5e0720b60d3fa0d1712ea5970acfd5e14" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.458794 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55017121b61bac574aa0603c66a1c0c5e0720b60d3fa0d1712ea5970acfd5e14"} err="failed to get container status \"55017121b61bac574aa0603c66a1c0c5e0720b60d3fa0d1712ea5970acfd5e14\": rpc error: code = NotFound desc = could not find container \"55017121b61bac574aa0603c66a1c0c5e0720b60d3fa0d1712ea5970acfd5e14\": container with ID starting with 55017121b61bac574aa0603c66a1c0c5e0720b60d3fa0d1712ea5970acfd5e14 not found: ID does not exist" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.536897 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-provider-selection\") pod \"d3f5ac66-56e5-4477-ac1b-1ef496242243\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.537285 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3f5ac66-56e5-4477-ac1b-1ef496242243-audit-dir\") pod \"d3f5ac66-56e5-4477-ac1b-1ef496242243\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.537416 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3f5ac66-56e5-4477-ac1b-1ef496242243-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d3f5ac66-56e5-4477-ac1b-1ef496242243" (UID: "d3f5ac66-56e5-4477-ac1b-1ef496242243"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.537551 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np5tk\" (UniqueName: \"kubernetes.io/projected/d3f5ac66-56e5-4477-ac1b-1ef496242243-kube-api-access-np5tk\") pod \"d3f5ac66-56e5-4477-ac1b-1ef496242243\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.537653 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-cliconfig\") pod \"d3f5ac66-56e5-4477-ac1b-1ef496242243\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.537739 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-trusted-ca-bundle\") pod \"d3f5ac66-56e5-4477-ac1b-1ef496242243\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.537816 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-router-certs\") pod \"d3f5ac66-56e5-4477-ac1b-1ef496242243\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.537932 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-session\") pod \"d3f5ac66-56e5-4477-ac1b-1ef496242243\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.538081 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-idp-0-file-data\") pod \"d3f5ac66-56e5-4477-ac1b-1ef496242243\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.538199 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-serving-cert\") pod \"d3f5ac66-56e5-4477-ac1b-1ef496242243\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.538555 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-cliconfig" (OuterVolumeSpecName: 
"v4-0-config-system-cliconfig") pod "d3f5ac66-56e5-4477-ac1b-1ef496242243" (UID: "d3f5ac66-56e5-4477-ac1b-1ef496242243"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.538698 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-ocp-branding-template\") pod \"d3f5ac66-56e5-4477-ac1b-1ef496242243\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.538781 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-error\") pod \"d3f5ac66-56e5-4477-ac1b-1ef496242243\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.538825 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d3f5ac66-56e5-4477-ac1b-1ef496242243" (UID: "d3f5ac66-56e5-4477-ac1b-1ef496242243"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.538876 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-audit-policies\") pod \"d3f5ac66-56e5-4477-ac1b-1ef496242243\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.539010 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-service-ca\") pod \"d3f5ac66-56e5-4477-ac1b-1ef496242243\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.539098 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-login\") pod \"d3f5ac66-56e5-4477-ac1b-1ef496242243\" (UID: \"d3f5ac66-56e5-4477-ac1b-1ef496242243\") " Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.539469 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.539547 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.539623 4756 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/d3f5ac66-56e5-4477-ac1b-1ef496242243-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.539489 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d3f5ac66-56e5-4477-ac1b-1ef496242243" (UID: "d3f5ac66-56e5-4477-ac1b-1ef496242243"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.539931 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d3f5ac66-56e5-4477-ac1b-1ef496242243" (UID: "d3f5ac66-56e5-4477-ac1b-1ef496242243"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.543432 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d3f5ac66-56e5-4477-ac1b-1ef496242243" (UID: "d3f5ac66-56e5-4477-ac1b-1ef496242243"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.543717 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d3f5ac66-56e5-4477-ac1b-1ef496242243" (UID: "d3f5ac66-56e5-4477-ac1b-1ef496242243"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.543818 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d3f5ac66-56e5-4477-ac1b-1ef496242243" (UID: "d3f5ac66-56e5-4477-ac1b-1ef496242243"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.544263 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d3f5ac66-56e5-4477-ac1b-1ef496242243" (UID: "d3f5ac66-56e5-4477-ac1b-1ef496242243"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.544331 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d3f5ac66-56e5-4477-ac1b-1ef496242243" (UID: "d3f5ac66-56e5-4477-ac1b-1ef496242243"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.544730 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d3f5ac66-56e5-4477-ac1b-1ef496242243" (UID: "d3f5ac66-56e5-4477-ac1b-1ef496242243"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.544893 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f5ac66-56e5-4477-ac1b-1ef496242243-kube-api-access-np5tk" (OuterVolumeSpecName: "kube-api-access-np5tk") pod "d3f5ac66-56e5-4477-ac1b-1ef496242243" (UID: "d3f5ac66-56e5-4477-ac1b-1ef496242243"). InnerVolumeSpecName "kube-api-access-np5tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.545019 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d3f5ac66-56e5-4477-ac1b-1ef496242243" (UID: "d3f5ac66-56e5-4477-ac1b-1ef496242243"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.547568 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d3f5ac66-56e5-4477-ac1b-1ef496242243" (UID: "d3f5ac66-56e5-4477-ac1b-1ef496242243"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.641012 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.641334 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.641422 4756 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.641512 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.641623 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.641712 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.641818 4756 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-np5tk\" (UniqueName: \"kubernetes.io/projected/d3f5ac66-56e5-4477-ac1b-1ef496242243-kube-api-access-np5tk\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.641897 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.641989 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.642069 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:19 crc kubenswrapper[4756]: I0318 14:05:19.642173 4756 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f5ac66-56e5-4477-ac1b-1ef496242243-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.345076 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.345150 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.350629 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.350695 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.351050 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.353052 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.353212 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.362142 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.365431 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.447148 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.451678 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.451769 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.453909 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.464408 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.478505 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.481693 
4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.733697 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:05:20 crc kubenswrapper[4756]: I0318 14:05:20.742317 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 14:05:20 crc kubenswrapper[4756]: W0318 14:05:20.881275 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-915514b6739695a6bd2b8a4baaf07d6e2d6b85a78f9eb30fd1c68402ae1b21c3 WatchSource:0}: Error finding container 915514b6739695a6bd2b8a4baaf07d6e2d6b85a78f9eb30fd1c68402ae1b21c3: Status 404 returned error can't find the container with id 915514b6739695a6bd2b8a4baaf07d6e2d6b85a78f9eb30fd1c68402ae1b21c3 Mar 18 14:05:21 crc kubenswrapper[4756]: W0318 14:05:21.153325 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-c4cfa4a0c158359d4abea269b52664b3c6a379d5963b4a8542c0535297fcfcd7 WatchSource:0}: Error finding container c4cfa4a0c158359d4abea269b52664b3c6a379d5963b4a8542c0535297fcfcd7: Status 404 returned error can't find the container with id c4cfa4a0c158359d4abea269b52664b3c6a379d5963b4a8542c0535297fcfcd7 Mar 18 14:05:21 crc kubenswrapper[4756]: W0318 14:05:21.232041 4756 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-77e5dedc3f80c86c43542e3c48eb6dde566c3f63b89342d3a9e32d1511d256c5 WatchSource:0}: Error finding container 77e5dedc3f80c86c43542e3c48eb6dde566c3f63b89342d3a9e32d1511d256c5: Status 404 returned error can't find the container with id 77e5dedc3f80c86c43542e3c48eb6dde566c3f63b89342d3a9e32d1511d256c5 Mar 18 14:05:21 crc kubenswrapper[4756]: I0318 14:05:21.450389 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0cb67367af6c914fd00c61e888f5f36e4d454cf6dbde9576866a10b47353b9c4"} Mar 18 14:05:21 crc kubenswrapper[4756]: I0318 14:05:21.450650 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c4cfa4a0c158359d4abea269b52664b3c6a379d5963b4a8542c0535297fcfcd7"} Mar 18 14:05:21 crc kubenswrapper[4756]: I0318 14:05:21.450851 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 14:05:21 crc kubenswrapper[4756]: I0318 14:05:21.452569 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4830d52eafdf8957440a9e7c6cb48978878a028da78cdd3fb200e8daf4263248"} Mar 18 14:05:21 crc kubenswrapper[4756]: I0318 14:05:21.452625 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"77e5dedc3f80c86c43542e3c48eb6dde566c3f63b89342d3a9e32d1511d256c5"} Mar 18 14:05:21 crc 
kubenswrapper[4756]: I0318 14:05:21.454690 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f00d0d5a86fcdf5f77cfe4d24df5ff273c001982d782e201c7f8dc4e670523dd"} Mar 18 14:05:21 crc kubenswrapper[4756]: I0318 14:05:21.454744 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"915514b6739695a6bd2b8a4baaf07d6e2d6b85a78f9eb30fd1c68402ae1b21c3"} Mar 18 14:05:22 crc kubenswrapper[4756]: I0318 14:05:22.932815 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:05:23 crc kubenswrapper[4756]: I0318 14:05:23.434383 4756 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:23 crc kubenswrapper[4756]: I0318 14:05:23.469099 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 18 14:05:23 crc kubenswrapper[4756]: I0318 14:05:23.469199 4756 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="4830d52eafdf8957440a9e7c6cb48978878a028da78cdd3fb200e8daf4263248" exitCode=255 Mar 18 14:05:23 crc kubenswrapper[4756]: I0318 14:05:23.469305 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"4830d52eafdf8957440a9e7c6cb48978878a028da78cdd3fb200e8daf4263248"} Mar 18 14:05:23 crc kubenswrapper[4756]: I0318 14:05:23.469634 4756 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:23 crc kubenswrapper[4756]: I0318 14:05:23.469747 4756 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9ee2d78-d52a-4766-aa8e-f68a998a4df5" Mar 18 14:05:23 crc kubenswrapper[4756]: I0318 14:05:23.469778 4756 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9ee2d78-d52a-4766-aa8e-f68a998a4df5" Mar 18 14:05:23 crc kubenswrapper[4756]: I0318 14:05:23.470112 4756 scope.go:117] "RemoveContainer" containerID="4830d52eafdf8957440a9e7c6cb48978878a028da78cdd3fb200e8daf4263248" Mar 18 14:05:23 crc kubenswrapper[4756]: I0318 14:05:23.477799 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:23 crc kubenswrapper[4756]: I0318 14:05:23.637729 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="efadec30-79e9-4228-82b6-469da7547836" Mar 18 14:05:24 crc kubenswrapper[4756]: I0318 14:05:24.478611 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 14:05:24 crc kubenswrapper[4756]: I0318 14:05:24.479597 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 18 14:05:24 crc kubenswrapper[4756]: I0318 14:05:24.479647 4756 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="172409e3d312e48c5a4930e02722e189ba2a62d56ecc4b89efc48c210b010c00" exitCode=255 Mar 18 14:05:24 crc kubenswrapper[4756]: I0318 14:05:24.480073 4756 
kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9ee2d78-d52a-4766-aa8e-f68a998a4df5" Mar 18 14:05:24 crc kubenswrapper[4756]: I0318 14:05:24.480095 4756 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9ee2d78-d52a-4766-aa8e-f68a998a4df5" Mar 18 14:05:24 crc kubenswrapper[4756]: I0318 14:05:24.480427 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"172409e3d312e48c5a4930e02722e189ba2a62d56ecc4b89efc48c210b010c00"} Mar 18 14:05:24 crc kubenswrapper[4756]: I0318 14:05:24.480474 4756 scope.go:117] "RemoveContainer" containerID="4830d52eafdf8957440a9e7c6cb48978878a028da78cdd3fb200e8daf4263248" Mar 18 14:05:24 crc kubenswrapper[4756]: I0318 14:05:24.481330 4756 scope.go:117] "RemoveContainer" containerID="172409e3d312e48c5a4930e02722e189ba2a62d56ecc4b89efc48c210b010c00" Mar 18 14:05:24 crc kubenswrapper[4756]: E0318 14:05:24.481886 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:05:24 crc kubenswrapper[4756]: I0318 14:05:24.496610 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="efadec30-79e9-4228-82b6-469da7547836" Mar 18 14:05:25 crc kubenswrapper[4756]: I0318 14:05:25.094298 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:05:25 crc kubenswrapper[4756]: I0318 14:05:25.103906 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:05:25 crc kubenswrapper[4756]: I0318 14:05:25.487709 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 14:05:25 crc kubenswrapper[4756]: I0318 14:05:25.488538 4756 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9ee2d78-d52a-4766-aa8e-f68a998a4df5" Mar 18 14:05:25 crc kubenswrapper[4756]: I0318 14:05:25.488566 4756 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9ee2d78-d52a-4766-aa8e-f68a998a4df5" Mar 18 14:05:25 crc kubenswrapper[4756]: I0318 14:05:25.491505 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="efadec30-79e9-4228-82b6-469da7547836" Mar 18 14:05:32 crc kubenswrapper[4756]: I0318 14:05:32.937661 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 14:05:33 crc kubenswrapper[4756]: I0318 14:05:33.775164 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 14:05:34 crc kubenswrapper[4756]: I0318 14:05:34.367565 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 14:05:34 crc kubenswrapper[4756]: I0318 14:05:34.427853 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 14:05:34 crc 
kubenswrapper[4756]: I0318 14:05:34.817989 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 14:05:34 crc kubenswrapper[4756]: I0318 14:05:34.924585 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 14:05:35 crc kubenswrapper[4756]: I0318 14:05:35.074365 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 14:05:35 crc kubenswrapper[4756]: I0318 14:05:35.643167 4756 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 14:05:35 crc kubenswrapper[4756]: I0318 14:05:35.643445 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=32.643390443 podStartE2EDuration="32.643390443s" podCreationTimestamp="2026-03-18 14:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:05:23.452429638 +0000 UTC m=+324.766847623" watchObservedRunningTime="2026-03-18 14:05:35.643390443 +0000 UTC m=+336.957808428" Mar 18 14:05:35 crc kubenswrapper[4756]: I0318 14:05:35.644438 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5cf6669bf4-24dn4" podStartSLOduration=35.644424861 podStartE2EDuration="35.644424861s" podCreationTimestamp="2026-03-18 14:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:05:23.443834598 +0000 UTC m=+324.758252593" watchObservedRunningTime="2026-03-18 14:05:35.644424861 +0000 UTC m=+336.958842856" Mar 18 14:05:35 crc kubenswrapper[4756]: I0318 14:05:35.646601 4756 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-55dfcc958b-qmz42" podStartSLOduration=35.646583198 podStartE2EDuration="35.646583198s" podCreationTimestamp="2026-03-18 14:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:05:23.46857588 +0000 UTC m=+324.782993885" watchObservedRunningTime="2026-03-18 14:05:35.646583198 +0000 UTC m=+336.961001183" Mar 18 14:05:35 crc kubenswrapper[4756]: I0318 14:05:35.650819 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-9sqqb"] Mar 18 14:05:35 crc kubenswrapper[4756]: I0318 14:05:35.650881 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 14:05:35 crc kubenswrapper[4756]: I0318 14:05:35.657380 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 14:05:35 crc kubenswrapper[4756]: I0318 14:05:35.675564 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=12.675533927 podStartE2EDuration="12.675533927s" podCreationTimestamp="2026-03-18 14:05:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:05:35.673522633 +0000 UTC m=+336.987940648" watchObservedRunningTime="2026-03-18 14:05:35.675533927 +0000 UTC m=+336.989951972" Mar 18 14:05:35 crc kubenswrapper[4756]: I0318 14:05:35.695361 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 14:05:35 crc kubenswrapper[4756]: I0318 14:05:35.778304 4756 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 14:05:35 crc kubenswrapper[4756]: I0318 14:05:35.858465 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 14:05:35 crc kubenswrapper[4756]: I0318 14:05:35.897928 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 14:05:35 crc kubenswrapper[4756]: I0318 14:05:35.911998 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 14:05:35 crc kubenswrapper[4756]: I0318 14:05:35.989958 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 14:05:36 crc kubenswrapper[4756]: I0318 14:05:36.104896 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 14:05:36 crc kubenswrapper[4756]: I0318 14:05:36.302943 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 14:05:36 crc kubenswrapper[4756]: I0318 14:05:36.315218 4756 scope.go:117] "RemoveContainer" containerID="172409e3d312e48c5a4930e02722e189ba2a62d56ecc4b89efc48c210b010c00" Mar 18 14:05:36 crc kubenswrapper[4756]: I0318 14:05:36.559506 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 14:05:36 crc kubenswrapper[4756]: I0318 14:05:36.559829 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"21352bf870342b442fb4e67af65067854be60c87569f3a4f1d410984d3c6130b"} Mar 18 14:05:36 crc kubenswrapper[4756]: 
I0318 14:05:36.647593 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 14:05:36 crc kubenswrapper[4756]: I0318 14:05:36.650518 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 14:05:36 crc kubenswrapper[4756]: I0318 14:05:36.650768 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 14:05:36 crc kubenswrapper[4756]: I0318 14:05:36.676815 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 14:05:36 crc kubenswrapper[4756]: I0318 14:05:36.745039 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 14:05:36 crc kubenswrapper[4756]: I0318 14:05:36.911749 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 14:05:36 crc kubenswrapper[4756]: I0318 14:05:36.981704 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.092728 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.165425 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.297366 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.325045 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d3f5ac66-56e5-4477-ac1b-1ef496242243" path="/var/lib/kubelet/pods/d3f5ac66-56e5-4477-ac1b-1ef496242243/volumes" Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.471210 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.562175 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.567349 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.567894 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.567948 4756 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="21352bf870342b442fb4e67af65067854be60c87569f3a4f1d410984d3c6130b" exitCode=255 Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.567983 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"21352bf870342b442fb4e67af65067854be60c87569f3a4f1d410984d3c6130b"} Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.568019 4756 scope.go:117] "RemoveContainer" containerID="172409e3d312e48c5a4930e02722e189ba2a62d56ecc4b89efc48c210b010c00" Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.568573 4756 scope.go:117] "RemoveContainer" containerID="21352bf870342b442fb4e67af65067854be60c87569f3a4f1d410984d3c6130b" Mar 18 14:05:37 crc kubenswrapper[4756]: E0318 
14:05:37.568969 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.644998 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.654474 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.724198 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.810503 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.890502 4756 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 14:05:37 crc kubenswrapper[4756]: I0318 14:05:37.968933 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 14:05:38 crc kubenswrapper[4756]: I0318 14:05:38.012569 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 14:05:38 crc kubenswrapper[4756]: I0318 14:05:38.019497 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 14:05:38 crc kubenswrapper[4756]: I0318 
14:05:38.173316 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 14:05:38 crc kubenswrapper[4756]: I0318 14:05:38.199662 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 14:05:38 crc kubenswrapper[4756]: I0318 14:05:38.211248 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 14:05:38 crc kubenswrapper[4756]: I0318 14:05:38.269922 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 14:05:38 crc kubenswrapper[4756]: I0318 14:05:38.394578 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 14:05:38 crc kubenswrapper[4756]: I0318 14:05:38.407677 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 14:05:38 crc kubenswrapper[4756]: I0318 14:05:38.467650 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 14:05:38 crc kubenswrapper[4756]: I0318 14:05:38.578831 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 18 14:05:38 crc kubenswrapper[4756]: I0318 14:05:38.583961 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 14:05:38 crc kubenswrapper[4756]: I0318 14:05:38.643091 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 14:05:38 crc kubenswrapper[4756]: I0318 14:05:38.753691 4756 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 14:05:38 crc kubenswrapper[4756]: I0318 14:05:38.774470 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 14:05:39 crc kubenswrapper[4756]: I0318 14:05:39.011231 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 14:05:39 crc kubenswrapper[4756]: I0318 14:05:39.035647 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 14:05:39 crc kubenswrapper[4756]: I0318 14:05:39.119072 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 14:05:39 crc kubenswrapper[4756]: I0318 14:05:39.144962 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 14:05:39 crc kubenswrapper[4756]: I0318 14:05:39.216902 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 14:05:39 crc kubenswrapper[4756]: I0318 14:05:39.304026 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 14:05:39 crc kubenswrapper[4756]: I0318 14:05:39.494690 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 14:05:39 crc kubenswrapper[4756]: I0318 14:05:39.549379 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 14:05:39 crc kubenswrapper[4756]: I0318 14:05:39.605541 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 14:05:39 crc kubenswrapper[4756]: I0318 
14:05:39.631989 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 14:05:39 crc kubenswrapper[4756]: I0318 14:05:39.634233 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 14:05:39 crc kubenswrapper[4756]: I0318 14:05:39.901736 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 14:05:39 crc kubenswrapper[4756]: I0318 14:05:39.929434 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 14:05:39 crc kubenswrapper[4756]: I0318 14:05:39.988609 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 14:05:40 crc kubenswrapper[4756]: I0318 14:05:40.015203 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 14:05:40 crc kubenswrapper[4756]: I0318 14:05:40.198883 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 14:05:40 crc kubenswrapper[4756]: I0318 14:05:40.508197 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 14:05:40 crc kubenswrapper[4756]: I0318 14:05:40.516919 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 14:05:40 crc kubenswrapper[4756]: I0318 14:05:40.627138 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 14:05:40 crc kubenswrapper[4756]: I0318 14:05:40.694259 4756 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 14:05:40 crc kubenswrapper[4756]: I0318 14:05:40.833814 4756 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 14:05:40 crc kubenswrapper[4756]: I0318 14:05:40.855907 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 14:05:40 crc kubenswrapper[4756]: I0318 14:05:40.856192 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 14:05:40 crc kubenswrapper[4756]: I0318 14:05:40.856474 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 14:05:40 crc kubenswrapper[4756]: I0318 14:05:40.893276 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 14:05:40 crc kubenswrapper[4756]: I0318 14:05:40.927795 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 14:05:40 crc kubenswrapper[4756]: I0318 14:05:40.951357 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 14:05:40 crc kubenswrapper[4756]: I0318 14:05:40.984566 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 14:05:41 crc kubenswrapper[4756]: I0318 14:05:41.001097 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 14:05:41 crc kubenswrapper[4756]: I0318 14:05:41.119853 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 14:05:41 crc kubenswrapper[4756]: I0318 
14:05:41.239485 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 14:05:41 crc kubenswrapper[4756]: I0318 14:05:41.302562 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 14:05:41 crc kubenswrapper[4756]: I0318 14:05:41.314807 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 14:05:41 crc kubenswrapper[4756]: I0318 14:05:41.346936 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 14:05:41 crc kubenswrapper[4756]: I0318 14:05:41.551632 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 14:05:41 crc kubenswrapper[4756]: I0318 14:05:41.666698 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 14:05:41 crc kubenswrapper[4756]: I0318 14:05:41.732164 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 14:05:41 crc kubenswrapper[4756]: I0318 14:05:41.760210 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 14:05:41 crc kubenswrapper[4756]: I0318 14:05:41.847032 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 14:05:41 crc kubenswrapper[4756]: I0318 14:05:41.965235 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 14:05:41 crc kubenswrapper[4756]: I0318 14:05:41.997628 4756 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.039036 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.090518 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.223489 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.234567 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.268791 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.308037 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.329452 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.335181 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.407479 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.463153 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.467478 4756 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.488963 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.694658 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.786890 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.886192 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.894760 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.923235 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.928663 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.958483 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 14:05:42 crc kubenswrapper[4756]: I0318 14:05:42.989296 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.012020 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 
18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.040372 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.109251 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.132560 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.190567 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.196472 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.210384 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.210469 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.250323 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.250419 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.320011 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.324753 4756 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.357576 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.418194 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.437803 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.476771 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.511942 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.513250 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-d559d64cf-9r5lw"] Mar 18 14:05:43 crc kubenswrapper[4756]: E0318 14:05:43.513640 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356d2f47-a922-458e-8578-a79ee650e100" containerName="installer" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.513682 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="356d2f47-a922-458e-8578-a79ee650e100" containerName="installer" Mar 18 14:05:43 crc kubenswrapper[4756]: E0318 14:05:43.513711 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f5ac66-56e5-4477-ac1b-1ef496242243" containerName="oauth-openshift" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.513728 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f5ac66-56e5-4477-ac1b-1ef496242243" containerName="oauth-openshift" Mar 18 
14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.513912 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="356d2f47-a922-458e-8578-a79ee650e100" containerName="installer" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.513942 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f5ac66-56e5-4477-ac1b-1ef496242243" containerName="oauth-openshift" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.514590 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.519346 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.519465 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.519832 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.519877 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.520340 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.520369 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.521523 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.521651 4756 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.521747 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.521867 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.521965 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.525798 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.532919 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.534825 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-user-template-login\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.534880 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-user-template-error\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc 
kubenswrapper[4756]: I0318 14:05:43.534918 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-service-ca\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.534937 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6dv9\" (UniqueName: \"kubernetes.io/projected/6b0c4431-c25b-4746-80f6-dc6f49510afc-kube-api-access-z6dv9\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.534966 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.534983 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-session\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.535017 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.535035 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.535058 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6b0c4431-c25b-4746-80f6-dc6f49510afc-audit-dir\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.535093 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-router-certs\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.535126 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d559d64cf-9r5lw\" 
(UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.535171 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.535196 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.535220 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6b0c4431-c25b-4746-80f6-dc6f49510afc-audit-policies\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.537430 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.541478 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.591609 4756 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"audit-1" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.636231 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.636291 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6b0c4431-c25b-4746-80f6-dc6f49510afc-audit-policies\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.636335 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-user-template-login\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.636370 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-user-template-error\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.636427 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-service-ca\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.636459 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6dv9\" (UniqueName: \"kubernetes.io/projected/6b0c4431-c25b-4746-80f6-dc6f49510afc-kube-api-access-z6dv9\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.636502 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.636539 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-session\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.636607 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " 
pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.636642 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.636679 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6b0c4431-c25b-4746-80f6-dc6f49510afc-audit-dir\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.636722 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-router-certs\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.636754 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.636784 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.637497 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6b0c4431-c25b-4746-80f6-dc6f49510afc-audit-policies\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.637788 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6b0c4431-c25b-4746-80f6-dc6f49510afc-audit-dir\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.638096 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.638672 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-service-ca\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.639809 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.644675 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.644802 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.646730 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-session\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.647280 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.647847 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-system-router-certs\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.647992 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.648083 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-user-template-login\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.649788 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6b0c4431-c25b-4746-80f6-dc6f49510afc-v4-0-config-user-template-error\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.678694 4756 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-z6dv9\" (UniqueName: \"kubernetes.io/projected/6b0c4431-c25b-4746-80f6-dc6f49510afc-kube-api-access-z6dv9\") pod \"oauth-openshift-d559d64cf-9r5lw\" (UID: \"6b0c4431-c25b-4746-80f6-dc6f49510afc\") " pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.683701 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.707408 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.716388 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.754958 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.775400 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.807705 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.810492 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.851339 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.944316 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.979137 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.981661 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 14:05:43 crc kubenswrapper[4756]: I0318 14:05:43.992033 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.018076 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.042991 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.049175 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.288374 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.337025 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.448845 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 
14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.462606 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.480797 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.540312 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.552827 4756 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.563632 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.603103 4756 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.607329 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.620295 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.620332 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.625509 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.868607 4756 reflector.go:368] Caches populated for *v1.Node from 
k8s.io/client-go/informers/factory.go:160 Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.885945 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.914822 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 14:05:44 crc kubenswrapper[4756]: I0318 14:05:44.937102 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 14:05:45 crc kubenswrapper[4756]: I0318 14:05:45.006484 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 14:05:45 crc kubenswrapper[4756]: I0318 14:05:45.239019 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 14:05:45 crc kubenswrapper[4756]: I0318 14:05:45.353215 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 14:05:45 crc kubenswrapper[4756]: I0318 14:05:45.533381 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 14:05:45 crc kubenswrapper[4756]: I0318 14:05:45.572430 4756 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 14:05:45 crc kubenswrapper[4756]: I0318 14:05:45.572733 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://c3871e675d7e5c9c7f87dcc384977ce8e00de046736fd7c02ad7c112bb2c61ba" gracePeriod=5 Mar 18 14:05:45 crc kubenswrapper[4756]: I0318 14:05:45.713294 4756 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 14:05:45 crc kubenswrapper[4756]: I0318 14:05:45.775063 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 14:05:45 crc kubenswrapper[4756]: I0318 14:05:45.790580 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 14:05:45 crc kubenswrapper[4756]: I0318 14:05:45.866095 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 14:05:45 crc kubenswrapper[4756]: I0318 14:05:45.885047 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 14:05:45 crc kubenswrapper[4756]: I0318 14:05:45.945657 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 14:05:45 crc kubenswrapper[4756]: I0318 14:05:45.960640 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 14:05:45 crc kubenswrapper[4756]: I0318 14:05:45.990625 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 14:05:46 crc kubenswrapper[4756]: I0318 14:05:46.068228 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 14:05:46 crc kubenswrapper[4756]: I0318 14:05:46.188164 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 14:05:46 crc kubenswrapper[4756]: I0318 14:05:46.226037 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 14:05:46 crc 
kubenswrapper[4756]: I0318 14:05:46.264965 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 14:05:46 crc kubenswrapper[4756]: I0318 14:05:46.358943 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 14:05:46 crc kubenswrapper[4756]: I0318 14:05:46.378264 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 14:05:46 crc kubenswrapper[4756]: I0318 14:05:46.389329 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 14:05:46 crc kubenswrapper[4756]: I0318 14:05:46.398037 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 14:05:46 crc kubenswrapper[4756]: I0318 14:05:46.561862 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 14:05:46 crc kubenswrapper[4756]: I0318 14:05:46.628188 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 14:05:46 crc kubenswrapper[4756]: I0318 14:05:46.785171 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 14:05:46 crc kubenswrapper[4756]: I0318 14:05:46.837092 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 14:05:46 crc kubenswrapper[4756]: I0318 14:05:46.862012 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.000300 4756 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.049804 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.049806 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.100825 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.233450 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.292077 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.298901 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.309410 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.320662 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.326613 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.331193 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.368532 4756 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.408069 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.473643 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.482394 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.526453 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.548649 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.661929 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.728562 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.747001 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.865289 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.868884 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 18 14:05:47 crc kubenswrapper[4756]: I0318 14:05:47.996796 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 18 14:05:48 crc kubenswrapper[4756]: I0318 14:05:48.086423 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 18 14:05:48 crc kubenswrapper[4756]: I0318 14:05:48.104801 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 18 14:05:48 crc kubenswrapper[4756]: I0318 14:05:48.132789 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 18 14:05:48 crc kubenswrapper[4756]: I0318 14:05:48.196846 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 18 14:05:48 crc kubenswrapper[4756]: I0318 14:05:48.476911 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 18 14:05:48 crc kubenswrapper[4756]: I0318 14:05:48.604587 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 18 14:05:48 crc kubenswrapper[4756]: I0318 14:05:48.609387 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 18 14:05:48 crc kubenswrapper[4756]: I0318 14:05:48.806808 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 18 14:05:48 crc kubenswrapper[4756]: I0318 14:05:48.851745 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 18 14:05:48 crc kubenswrapper[4756]: I0318 14:05:48.992598 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 18 14:05:49 crc kubenswrapper[4756]: I0318 14:05:49.320135 4756 scope.go:117] "RemoveContainer" containerID="21352bf870342b442fb4e67af65067854be60c87569f3a4f1d410984d3c6130b"
Mar 18 14:05:49 crc kubenswrapper[4756]: E0318 14:05:49.321507 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 14:05:49 crc kubenswrapper[4756]: I0318 14:05:49.352677 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 18 14:05:49 crc kubenswrapper[4756]: I0318 14:05:49.616174 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 18 14:05:49 crc kubenswrapper[4756]: I0318 14:05:49.635329 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 18 14:05:49 crc kubenswrapper[4756]: I0318 14:05:49.651138 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 18 14:05:49 crc kubenswrapper[4756]: I0318 14:05:49.913379 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 18 14:05:50 crc kubenswrapper[4756]: I0318 14:05:50.035928 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 18 14:05:50 crc kubenswrapper[4756]: I0318 14:05:50.069630 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 18 14:05:50 crc kubenswrapper[4756]: I0318 14:05:50.438294 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 18 14:05:50 crc kubenswrapper[4756]: I0318 14:05:50.528962 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 18 14:05:50 crc kubenswrapper[4756]: I0318 14:05:50.647825 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 18 14:05:50 crc kubenswrapper[4756]: I0318 14:05:50.647866 4756 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="c3871e675d7e5c9c7f87dcc384977ce8e00de046736fd7c02ad7c112bb2c61ba" exitCode=137
Mar 18 14:05:50 crc kubenswrapper[4756]: I0318 14:05:50.679011 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 18 14:05:50 crc kubenswrapper[4756]: I0318 14:05:50.692912 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 18 14:05:50 crc kubenswrapper[4756]: I0318 14:05:50.913944 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 18 14:05:50 crc kubenswrapper[4756]: I0318 14:05:50.953818 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.037751 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-d559d64cf-9r5lw"]
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.152748 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.179961 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.180037 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.193371 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.235313 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.235455 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.235515 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.235540 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.235569 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.235863 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.235904 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.235930 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.235937 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.247847 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.323504 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.323838 4756 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.333906 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.333953 4756 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="fbf0b910-7774-4e53-ac74-175dd8cb6b3f"
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.337060 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.337094 4756 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="fbf0b910-7774-4e53-ac74-175dd8cb6b3f"
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.337435 4756 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.337495 4756 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.337522 4756 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.337545 4756 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.337568 4756 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.381775 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.435353 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-d559d64cf-9r5lw"]
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.539466 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.655456 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.655953 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.655980 4756 scope.go:117] "RemoveContainer" containerID="c3871e675d7e5c9c7f87dcc384977ce8e00de046736fd7c02ad7c112bb2c61ba"
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.657689 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 18 14:05:51 crc kubenswrapper[4756]: I0318 14:05:51.659368 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" event={"ID":"6b0c4431-c25b-4746-80f6-dc6f49510afc","Type":"ContainerStarted","Data":"da6a7d68852333e04f59c155fa80d834251512062826515c1ba4814eb8c044b4"}
Mar 18 14:05:52 crc kubenswrapper[4756]: I0318 14:05:52.666227 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" event={"ID":"6b0c4431-c25b-4746-80f6-dc6f49510afc","Type":"ContainerStarted","Data":"b8b1153fd489f46ca5653ab076dd4b39ae81dac10b5ef07c1e88d068302b9f7b"}
Mar 18 14:05:52 crc kubenswrapper[4756]: I0318 14:05:52.667106 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw"
Mar 18 14:05:52 crc kubenswrapper[4756]: I0318 14:05:52.672782 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw"
Mar 18 14:05:52 crc kubenswrapper[4756]: I0318 14:05:52.687779 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-d559d64cf-9r5lw" podStartSLOduration=59.687762847 podStartE2EDuration="59.687762847s" podCreationTimestamp="2026-03-18 14:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:05:52.68602755 +0000 UTC m=+354.000445535" watchObservedRunningTime="2026-03-18 14:05:52.687762847 +0000 UTC m=+354.002180832"
Mar 18 14:05:52 crc kubenswrapper[4756]: I0318 14:05:52.897331 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.176481 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564046-6bpkl"]
Mar 18 14:06:00 crc kubenswrapper[4756]: E0318 14:06:00.177486 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.177520 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.177715 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.178375 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564046-6bpkl"
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.180207 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx"
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.180835 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.181073 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.183176 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564046-6bpkl"]
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.315459 4756 scope.go:117] "RemoveContainer" containerID="21352bf870342b442fb4e67af65067854be60c87569f3a4f1d410984d3c6130b"
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.353385 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtvpn\" (UniqueName: \"kubernetes.io/projected/abe405f4-7a33-48cb-be35-37815487343f-kube-api-access-wtvpn\") pod \"auto-csr-approver-29564046-6bpkl\" (UID: \"abe405f4-7a33-48cb-be35-37815487343f\") " pod="openshift-infra/auto-csr-approver-29564046-6bpkl"
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.454600 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtvpn\" (UniqueName: \"kubernetes.io/projected/abe405f4-7a33-48cb-be35-37815487343f-kube-api-access-wtvpn\") pod \"auto-csr-approver-29564046-6bpkl\" (UID: \"abe405f4-7a33-48cb-be35-37815487343f\") " pod="openshift-infra/auto-csr-approver-29564046-6bpkl"
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.474255 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtvpn\" (UniqueName: \"kubernetes.io/projected/abe405f4-7a33-48cb-be35-37815487343f-kube-api-access-wtvpn\") pod \"auto-csr-approver-29564046-6bpkl\" (UID: \"abe405f4-7a33-48cb-be35-37815487343f\") " pod="openshift-infra/auto-csr-approver-29564046-6bpkl"
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.493237 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564046-6bpkl"
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.721377 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log"
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.721488 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6eb8f0ae256fdd845738af4aec40190ed70f82b9e6a4c824b66fa99af87ebe12"}
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.744689 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 14:06:00 crc kubenswrapper[4756]: I0318 14:06:00.888618 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564046-6bpkl"]
Mar 18 14:06:00 crc kubenswrapper[4756]: W0318 14:06:00.897328 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabe405f4_7a33_48cb_be35_37815487343f.slice/crio-ce12d7c75876fd0db25b686ada991af903dcf85a7250ab4e8c063d516db87461 WatchSource:0}: Error finding container ce12d7c75876fd0db25b686ada991af903dcf85a7250ab4e8c063d516db87461: Status 404 returned error can't find the container with id ce12d7c75876fd0db25b686ada991af903dcf85a7250ab4e8c063d516db87461
Mar 18 14:06:01 crc kubenswrapper[4756]: I0318 14:06:01.732354 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564046-6bpkl" event={"ID":"abe405f4-7a33-48cb-be35-37815487343f","Type":"ContainerStarted","Data":"ce12d7c75876fd0db25b686ada991af903dcf85a7250ab4e8c063d516db87461"}
Mar 18 14:06:02 crc kubenswrapper[4756]: I0318 14:06:02.740032 4756 generic.go:334] "Generic (PLEG): container finished" podID="abe405f4-7a33-48cb-be35-37815487343f" containerID="786b7accb599d44b74db20cc44d7928d481a7f1829c31af618ebf0b67f372ddd" exitCode=0
Mar 18 14:06:02 crc kubenswrapper[4756]: I0318 14:06:02.740199 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564046-6bpkl" event={"ID":"abe405f4-7a33-48cb-be35-37815487343f","Type":"ContainerDied","Data":"786b7accb599d44b74db20cc44d7928d481a7f1829c31af618ebf0b67f372ddd"}
Mar 18 14:06:04 crc kubenswrapper[4756]: I0318 14:06:04.038253 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564046-6bpkl"
Mar 18 14:06:04 crc kubenswrapper[4756]: I0318 14:06:04.201084 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtvpn\" (UniqueName: \"kubernetes.io/projected/abe405f4-7a33-48cb-be35-37815487343f-kube-api-access-wtvpn\") pod \"abe405f4-7a33-48cb-be35-37815487343f\" (UID: \"abe405f4-7a33-48cb-be35-37815487343f\") "
Mar 18 14:06:04 crc kubenswrapper[4756]: I0318 14:06:04.207137 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe405f4-7a33-48cb-be35-37815487343f-kube-api-access-wtvpn" (OuterVolumeSpecName: "kube-api-access-wtvpn") pod "abe405f4-7a33-48cb-be35-37815487343f" (UID: "abe405f4-7a33-48cb-be35-37815487343f"). InnerVolumeSpecName "kube-api-access-wtvpn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:06:04 crc kubenswrapper[4756]: I0318 14:06:04.302457 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtvpn\" (UniqueName: \"kubernetes.io/projected/abe405f4-7a33-48cb-be35-37815487343f-kube-api-access-wtvpn\") on node \"crc\" DevicePath \"\""
Mar 18 14:06:04 crc kubenswrapper[4756]: I0318 14:06:04.755778 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564046-6bpkl" event={"ID":"abe405f4-7a33-48cb-be35-37815487343f","Type":"ContainerDied","Data":"ce12d7c75876fd0db25b686ada991af903dcf85a7250ab4e8c063d516db87461"}
Mar 18 14:06:04 crc kubenswrapper[4756]: I0318 14:06:04.755836 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce12d7c75876fd0db25b686ada991af903dcf85a7250ab4e8c063d516db87461"
Mar 18 14:06:04 crc kubenswrapper[4756]: I0318 14:06:04.755867 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564046-6bpkl"
Mar 18 14:06:10 crc kubenswrapper[4756]: I0318 14:06:10.792047 4756 generic.go:334] "Generic (PLEG): container finished" podID="487d1c97-b703-4f1b-8c77-c23b4366a467" containerID="08d1ec45761c189938b8c75e287b3fa3027fce6b571256842d1be256be20d3d3" exitCode=0
Mar 18 14:06:10 crc kubenswrapper[4756]: I0318 14:06:10.792161 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" event={"ID":"487d1c97-b703-4f1b-8c77-c23b4366a467","Type":"ContainerDied","Data":"08d1ec45761c189938b8c75e287b3fa3027fce6b571256842d1be256be20d3d3"}
Mar 18 14:06:10 crc kubenswrapper[4756]: I0318 14:06:10.793059 4756 scope.go:117] "RemoveContainer" containerID="08d1ec45761c189938b8c75e287b3fa3027fce6b571256842d1be256be20d3d3"
Mar 18 14:06:11 crc kubenswrapper[4756]: I0318 14:06:11.802294 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" event={"ID":"487d1c97-b703-4f1b-8c77-c23b4366a467","Type":"ContainerStarted","Data":"a8c7eca6a9bb3103aa3afdd50a9beeb16b83bc73ee47336dc20fa3c42bd3d4cd"}
Mar 18 14:06:11 crc kubenswrapper[4756]: I0318 14:06:11.802983 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5"
Mar 18 14:06:11 crc kubenswrapper[4756]: I0318 14:06:11.806245 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.257493 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-prb7j"]
Mar 18 14:06:52 crc kubenswrapper[4756]: E0318 14:06:52.258285 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe405f4-7a33-48cb-be35-37815487343f" containerName="oc"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.258300 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe405f4-7a33-48cb-be35-37815487343f" containerName="oc"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.258417 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="abe405f4-7a33-48cb-be35-37815487343f" containerName="oc"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.259014 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.274386 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-prb7j"]
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.350065 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/05387aad-4632-43b4-a910-3b2b55b8918d-registry-certificates\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.350147 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/05387aad-4632-43b4-a910-3b2b55b8918d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.350210 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05387aad-4632-43b4-a910-3b2b55b8918d-trusted-ca\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.350255 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/05387aad-4632-43b4-a910-3b2b55b8918d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.350302 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.350353 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/05387aad-4632-43b4-a910-3b2b55b8918d-bound-sa-token\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.350419 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/05387aad-4632-43b4-a910-3b2b55b8918d-registry-tls\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.350443 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xwq9\" (UniqueName: \"kubernetes.io/projected/05387aad-4632-43b4-a910-3b2b55b8918d-kube-api-access-5xwq9\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.369002 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.451341 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05387aad-4632-43b4-a910-3b2b55b8918d-trusted-ca\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.451421 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/05387aad-4632-43b4-a910-3b2b55b8918d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.451452 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/05387aad-4632-43b4-a910-3b2b55b8918d-bound-sa-token\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.451483 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/05387aad-4632-43b4-a910-3b2b55b8918d-registry-tls\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.451506 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xwq9\" (UniqueName: \"kubernetes.io/projected/05387aad-4632-43b4-a910-3b2b55b8918d-kube-api-access-5xwq9\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.451542 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/05387aad-4632-43b4-a910-3b2b55b8918d-registry-certificates\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.451575 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/05387aad-4632-43b4-a910-3b2b55b8918d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.452351 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/05387aad-4632-43b4-a910-3b2b55b8918d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.453182 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/05387aad-4632-43b4-a910-3b2b55b8918d-registry-certificates\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.453412 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05387aad-4632-43b4-a910-3b2b55b8918d-trusted-ca\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.458521 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/05387aad-4632-43b4-a910-3b2b55b8918d-registry-tls\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.459425 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/05387aad-4632-43b4-a910-3b2b55b8918d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.468316 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xwq9\" (UniqueName: \"kubernetes.io/projected/05387aad-4632-43b4-a910-3b2b55b8918d-kube-api-access-5xwq9\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.476175 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/05387aad-4632-43b4-a910-3b2b55b8918d-bound-sa-token\") pod \"image-registry-66df7c8f76-prb7j\" (UID: \"05387aad-4632-43b4-a910-3b2b55b8918d\") " pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:52 crc kubenswrapper[4756]: I0318 14:06:52.580001 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:53 crc kubenswrapper[4756]: I0318 14:06:53.054608 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-prb7j"]
Mar 18 14:06:54 crc kubenswrapper[4756]: I0318 14:06:54.063929 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-prb7j" event={"ID":"05387aad-4632-43b4-a910-3b2b55b8918d","Type":"ContainerStarted","Data":"de457b074f7b9586f5ec48f87d6b441fa995a9f2e143dcb4a30efa9e1e73b71c"}
Mar 18 14:06:54 crc kubenswrapper[4756]: I0318 14:06:54.065392 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-prb7j" event={"ID":"05387aad-4632-43b4-a910-3b2b55b8918d","Type":"ContainerStarted","Data":"216ffb9958e23a41775112e7ffcd1fea1257daf69812ff15b4271ac8dba87a87"}
Mar 18 14:06:54 crc kubenswrapper[4756]: I0318 14:06:54.065450 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-prb7j"
Mar 18 14:06:54 crc kubenswrapper[4756]: I0318 14:06:54.099718 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-prb7j" podStartSLOduration=2.099678381 podStartE2EDuration="2.099678381s" podCreationTimestamp="2026-03-18 14:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:06:54.0941035 +0000 UTC m=+415.408521475" watchObservedRunningTime="2026-03-18 14:06:54.099678381 +0000 UTC m=+415.414096406"
Mar 18 14:07:12 crc kubenswrapper[4756]: I0318 14:07:12.588506 
4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-prb7j" Mar 18 14:07:12 crc kubenswrapper[4756]: I0318 14:07:12.677408 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9pdb4"] Mar 18 14:07:19 crc kubenswrapper[4756]: I0318 14:07:19.874793 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w28cp"] Mar 18 14:07:19 crc kubenswrapper[4756]: I0318 14:07:19.875609 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w28cp" podUID="7bb3189f-716d-4fef-b885-3a031a60d981" containerName="registry-server" containerID="cri-o://28d969f89913a9a030750c4a9dd024178f31c8822a56ad7c3e58adbfb440881a" gracePeriod=30 Mar 18 14:07:19 crc kubenswrapper[4756]: I0318 14:07:19.881892 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lkr5r"] Mar 18 14:07:19 crc kubenswrapper[4756]: I0318 14:07:19.894873 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4ckd5"] Mar 18 14:07:19 crc kubenswrapper[4756]: I0318 14:07:19.895150 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" podUID="487d1c97-b703-4f1b-8c77-c23b4366a467" containerName="marketplace-operator" containerID="cri-o://a8c7eca6a9bb3103aa3afdd50a9beeb16b83bc73ee47336dc20fa3c42bd3d4cd" gracePeriod=30 Mar 18 14:07:19 crc kubenswrapper[4756]: I0318 14:07:19.899470 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hrhl"] Mar 18 14:07:19 crc kubenswrapper[4756]: I0318 14:07:19.899709 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9hrhl" 
podUID="caee1439-b7bb-456e-982f-1c3c3cdb51c3" containerName="registry-server" containerID="cri-o://e26f416eeb1167ba699cba6d84e1fedae84f98109849c081a527ee30b31a601c" gracePeriod=30 Mar 18 14:07:19 crc kubenswrapper[4756]: I0318 14:07:19.914629 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gs6l5"] Mar 18 14:07:19 crc kubenswrapper[4756]: I0318 14:07:19.914887 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gs6l5" podUID="af462049-61c3-4da5-aeb0-0311404c4741" containerName="registry-server" containerID="cri-o://0eca24fe5bac3c631ba88bd97407443f30d61316ccb50ad843025b68bf12b372" gracePeriod=30 Mar 18 14:07:19 crc kubenswrapper[4756]: I0318 14:07:19.916548 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45k2m"] Mar 18 14:07:19 crc kubenswrapper[4756]: I0318 14:07:19.917271 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" Mar 18 14:07:19 crc kubenswrapper[4756]: I0318 14:07:19.929735 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45k2m"] Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.071253 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5669z\" (UniqueName: \"kubernetes.io/projected/df050fc3-f811-40d6-a005-b9dc7062fdf5-kube-api-access-5669z\") pod \"marketplace-operator-79b997595-45k2m\" (UID: \"df050fc3-f811-40d6-a005-b9dc7062fdf5\") " pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.071327 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df050fc3-f811-40d6-a005-b9dc7062fdf5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-45k2m\" (UID: \"df050fc3-f811-40d6-a005-b9dc7062fdf5\") " pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.071372 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/df050fc3-f811-40d6-a005-b9dc7062fdf5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-45k2m\" (UID: \"df050fc3-f811-40d6-a005-b9dc7062fdf5\") " pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.172640 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5669z\" (UniqueName: \"kubernetes.io/projected/df050fc3-f811-40d6-a005-b9dc7062fdf5-kube-api-access-5669z\") pod \"marketplace-operator-79b997595-45k2m\" (UID: 
\"df050fc3-f811-40d6-a005-b9dc7062fdf5\") " pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.172929 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df050fc3-f811-40d6-a005-b9dc7062fdf5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-45k2m\" (UID: \"df050fc3-f811-40d6-a005-b9dc7062fdf5\") " pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.172959 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/df050fc3-f811-40d6-a005-b9dc7062fdf5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-45k2m\" (UID: \"df050fc3-f811-40d6-a005-b9dc7062fdf5\") " pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.179058 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/df050fc3-f811-40d6-a005-b9dc7062fdf5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-45k2m\" (UID: \"df050fc3-f811-40d6-a005-b9dc7062fdf5\") " pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.182357 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df050fc3-f811-40d6-a005-b9dc7062fdf5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-45k2m\" (UID: \"df050fc3-f811-40d6-a005-b9dc7062fdf5\") " pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.189700 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5669z\" (UniqueName: \"kubernetes.io/projected/df050fc3-f811-40d6-a005-b9dc7062fdf5-kube-api-access-5669z\") pod \"marketplace-operator-79b997595-45k2m\" (UID: \"df050fc3-f811-40d6-a005-b9dc7062fdf5\") " pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.225381 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.239804 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.241658 4756 generic.go:334] "Generic (PLEG): container finished" podID="caee1439-b7bb-456e-982f-1c3c3cdb51c3" containerID="e26f416eeb1167ba699cba6d84e1fedae84f98109849c081a527ee30b31a601c" exitCode=0 Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.241727 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hrhl" event={"ID":"caee1439-b7bb-456e-982f-1c3c3cdb51c3","Type":"ContainerDied","Data":"e26f416eeb1167ba699cba6d84e1fedae84f98109849c081a527ee30b31a601c"} Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.244647 4756 generic.go:334] "Generic (PLEG): container finished" podID="7bb3189f-716d-4fef-b885-3a031a60d981" containerID="28d969f89913a9a030750c4a9dd024178f31c8822a56ad7c3e58adbfb440881a" exitCode=0 Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.244742 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w28cp" event={"ID":"7bb3189f-716d-4fef-b885-3a031a60d981","Type":"ContainerDied","Data":"28d969f89913a9a030750c4a9dd024178f31c8822a56ad7c3e58adbfb440881a"} Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.244796 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-w28cp" event={"ID":"7bb3189f-716d-4fef-b885-3a031a60d981","Type":"ContainerDied","Data":"aece27eb672ef310f383f576b2abfc62b69117204b72ffe1273500fce32133f4"} Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.244816 4756 scope.go:117] "RemoveContainer" containerID="28d969f89913a9a030750c4a9dd024178f31c8822a56ad7c3e58adbfb440881a" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.245020 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w28cp" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.249094 4756 generic.go:334] "Generic (PLEG): container finished" podID="487d1c97-b703-4f1b-8c77-c23b4366a467" containerID="a8c7eca6a9bb3103aa3afdd50a9beeb16b83bc73ee47336dc20fa3c42bd3d4cd" exitCode=0 Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.249157 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" event={"ID":"487d1c97-b703-4f1b-8c77-c23b4366a467","Type":"ContainerDied","Data":"a8c7eca6a9bb3103aa3afdd50a9beeb16b83bc73ee47336dc20fa3c42bd3d4cd"} Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.251688 4756 generic.go:334] "Generic (PLEG): container finished" podID="af462049-61c3-4da5-aeb0-0311404c4741" containerID="0eca24fe5bac3c631ba88bd97407443f30d61316ccb50ad843025b68bf12b372" exitCode=0 Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.251875 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lkr5r" podUID="48602255-9809-498e-9c4a-6053ba5ff591" containerName="registry-server" containerID="cri-o://510acfd7467e9c19579eba546928e683416362c816d3015cf2462f9f4c10361a" gracePeriod=30 Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.251847 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs6l5" 
event={"ID":"af462049-61c3-4da5-aeb0-0311404c4741","Type":"ContainerDied","Data":"0eca24fe5bac3c631ba88bd97407443f30d61316ccb50ad843025b68bf12b372"} Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.278357 4756 scope.go:117] "RemoveContainer" containerID="58699ce81b73c57dccbc44b3771f71b4b535826b137835d929dc4811ddc3d10d" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.374777 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22d5s\" (UniqueName: \"kubernetes.io/projected/7bb3189f-716d-4fef-b885-3a031a60d981-kube-api-access-22d5s\") pod \"7bb3189f-716d-4fef-b885-3a031a60d981\" (UID: \"7bb3189f-716d-4fef-b885-3a031a60d981\") " Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.374837 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb3189f-716d-4fef-b885-3a031a60d981-utilities\") pod \"7bb3189f-716d-4fef-b885-3a031a60d981\" (UID: \"7bb3189f-716d-4fef-b885-3a031a60d981\") " Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.374898 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb3189f-716d-4fef-b885-3a031a60d981-catalog-content\") pod \"7bb3189f-716d-4fef-b885-3a031a60d981\" (UID: \"7bb3189f-716d-4fef-b885-3a031a60d981\") " Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.376201 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bb3189f-716d-4fef-b885-3a031a60d981-utilities" (OuterVolumeSpecName: "utilities") pod "7bb3189f-716d-4fef-b885-3a031a60d981" (UID: "7bb3189f-716d-4fef-b885-3a031a60d981"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.376461 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb3189f-716d-4fef-b885-3a031a60d981-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.380227 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb3189f-716d-4fef-b885-3a031a60d981-kube-api-access-22d5s" (OuterVolumeSpecName: "kube-api-access-22d5s") pod "7bb3189f-716d-4fef-b885-3a031a60d981" (UID: "7bb3189f-716d-4fef-b885-3a031a60d981"). InnerVolumeSpecName "kube-api-access-22d5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.385611 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.393271 4756 scope.go:117] "RemoveContainer" containerID="00fab9a64615b7815712deba913c7cad59eca83b4fbd4e5ebb424ff868e4da66" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.406669 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.413837 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.447733 4756 scope.go:117] "RemoveContainer" containerID="28d969f89913a9a030750c4a9dd024178f31c8822a56ad7c3e58adbfb440881a" Mar 18 14:07:20 crc kubenswrapper[4756]: E0318 14:07:20.448322 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28d969f89913a9a030750c4a9dd024178f31c8822a56ad7c3e58adbfb440881a\": container with ID starting with 28d969f89913a9a030750c4a9dd024178f31c8822a56ad7c3e58adbfb440881a not found: ID does not exist" containerID="28d969f89913a9a030750c4a9dd024178f31c8822a56ad7c3e58adbfb440881a" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.448371 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28d969f89913a9a030750c4a9dd024178f31c8822a56ad7c3e58adbfb440881a"} err="failed to get container status \"28d969f89913a9a030750c4a9dd024178f31c8822a56ad7c3e58adbfb440881a\": rpc error: code = NotFound desc = could not find container \"28d969f89913a9a030750c4a9dd024178f31c8822a56ad7c3e58adbfb440881a\": container with ID starting with 28d969f89913a9a030750c4a9dd024178f31c8822a56ad7c3e58adbfb440881a not found: ID does not exist" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.448397 4756 scope.go:117] "RemoveContainer" containerID="58699ce81b73c57dccbc44b3771f71b4b535826b137835d929dc4811ddc3d10d" Mar 18 14:07:20 crc kubenswrapper[4756]: E0318 14:07:20.448876 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58699ce81b73c57dccbc44b3771f71b4b535826b137835d929dc4811ddc3d10d\": container with ID starting with 58699ce81b73c57dccbc44b3771f71b4b535826b137835d929dc4811ddc3d10d not found: ID does not exist" containerID="58699ce81b73c57dccbc44b3771f71b4b535826b137835d929dc4811ddc3d10d" Mar 18 14:07:20 crc kubenswrapper[4756]: 
I0318 14:07:20.448930 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58699ce81b73c57dccbc44b3771f71b4b535826b137835d929dc4811ddc3d10d"} err="failed to get container status \"58699ce81b73c57dccbc44b3771f71b4b535826b137835d929dc4811ddc3d10d\": rpc error: code = NotFound desc = could not find container \"58699ce81b73c57dccbc44b3771f71b4b535826b137835d929dc4811ddc3d10d\": container with ID starting with 58699ce81b73c57dccbc44b3771f71b4b535826b137835d929dc4811ddc3d10d not found: ID does not exist" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.448950 4756 scope.go:117] "RemoveContainer" containerID="00fab9a64615b7815712deba913c7cad59eca83b4fbd4e5ebb424ff868e4da66" Mar 18 14:07:20 crc kubenswrapper[4756]: E0318 14:07:20.449305 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00fab9a64615b7815712deba913c7cad59eca83b4fbd4e5ebb424ff868e4da66\": container with ID starting with 00fab9a64615b7815712deba913c7cad59eca83b4fbd4e5ebb424ff868e4da66 not found: ID does not exist" containerID="00fab9a64615b7815712deba913c7cad59eca83b4fbd4e5ebb424ff868e4da66" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.449343 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00fab9a64615b7815712deba913c7cad59eca83b4fbd4e5ebb424ff868e4da66"} err="failed to get container status \"00fab9a64615b7815712deba913c7cad59eca83b4fbd4e5ebb424ff868e4da66\": rpc error: code = NotFound desc = could not find container \"00fab9a64615b7815712deba913c7cad59eca83b4fbd4e5ebb424ff868e4da66\": container with ID starting with 00fab9a64615b7815712deba913c7cad59eca83b4fbd4e5ebb424ff868e4da66 not found: ID does not exist" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.449383 4756 scope.go:117] "RemoveContainer" containerID="08d1ec45761c189938b8c75e287b3fa3027fce6b571256842d1be256be20d3d3" Mar 18 14:07:20 crc 
kubenswrapper[4756]: I0318 14:07:20.478527 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/487d1c97-b703-4f1b-8c77-c23b4366a467-marketplace-trusted-ca\") pod \"487d1c97-b703-4f1b-8c77-c23b4366a467\" (UID: \"487d1c97-b703-4f1b-8c77-c23b4366a467\") " Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.478590 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjgd5\" (UniqueName: \"kubernetes.io/projected/487d1c97-b703-4f1b-8c77-c23b4366a467-kube-api-access-zjgd5\") pod \"487d1c97-b703-4f1b-8c77-c23b4366a467\" (UID: \"487d1c97-b703-4f1b-8c77-c23b4366a467\") " Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.478636 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af462049-61c3-4da5-aeb0-0311404c4741-utilities\") pod \"af462049-61c3-4da5-aeb0-0311404c4741\" (UID: \"af462049-61c3-4da5-aeb0-0311404c4741\") " Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.478829 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22d5s\" (UniqueName: \"kubernetes.io/projected/7bb3189f-716d-4fef-b885-3a031a60d981-kube-api-access-22d5s\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.479508 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af462049-61c3-4da5-aeb0-0311404c4741-utilities" (OuterVolumeSpecName: "utilities") pod "af462049-61c3-4da5-aeb0-0311404c4741" (UID: "af462049-61c3-4da5-aeb0-0311404c4741"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.480268 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/487d1c97-b703-4f1b-8c77-c23b4366a467-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "487d1c97-b703-4f1b-8c77-c23b4366a467" (UID: "487d1c97-b703-4f1b-8c77-c23b4366a467"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.483916 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487d1c97-b703-4f1b-8c77-c23b4366a467-kube-api-access-zjgd5" (OuterVolumeSpecName: "kube-api-access-zjgd5") pod "487d1c97-b703-4f1b-8c77-c23b4366a467" (UID: "487d1c97-b703-4f1b-8c77-c23b4366a467"). InnerVolumeSpecName "kube-api-access-zjgd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.502643 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bb3189f-716d-4fef-b885-3a031a60d981-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7bb3189f-716d-4fef-b885-3a031a60d981" (UID: "7bb3189f-716d-4fef-b885-3a031a60d981"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.574290 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w28cp"] Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.578243 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w28cp"] Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.579825 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56d8b\" (UniqueName: \"kubernetes.io/projected/caee1439-b7bb-456e-982f-1c3c3cdb51c3-kube-api-access-56d8b\") pod \"caee1439-b7bb-456e-982f-1c3c3cdb51c3\" (UID: \"caee1439-b7bb-456e-982f-1c3c3cdb51c3\") " Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.579873 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af462049-61c3-4da5-aeb0-0311404c4741-catalog-content\") pod \"af462049-61c3-4da5-aeb0-0311404c4741\" (UID: \"af462049-61c3-4da5-aeb0-0311404c4741\") " Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.579902 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caee1439-b7bb-456e-982f-1c3c3cdb51c3-utilities\") pod \"caee1439-b7bb-456e-982f-1c3c3cdb51c3\" (UID: \"caee1439-b7bb-456e-982f-1c3c3cdb51c3\") " Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.579932 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jcxq\" (UniqueName: \"kubernetes.io/projected/af462049-61c3-4da5-aeb0-0311404c4741-kube-api-access-8jcxq\") pod \"af462049-61c3-4da5-aeb0-0311404c4741\" (UID: \"af462049-61c3-4da5-aeb0-0311404c4741\") " Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.579997 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/487d1c97-b703-4f1b-8c77-c23b4366a467-marketplace-operator-metrics\") pod \"487d1c97-b703-4f1b-8c77-c23b4366a467\" (UID: \"487d1c97-b703-4f1b-8c77-c23b4366a467\") " Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.580038 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caee1439-b7bb-456e-982f-1c3c3cdb51c3-catalog-content\") pod \"caee1439-b7bb-456e-982f-1c3c3cdb51c3\" (UID: \"caee1439-b7bb-456e-982f-1c3c3cdb51c3\") " Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.580327 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb3189f-716d-4fef-b885-3a031a60d981-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.580345 4756 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/487d1c97-b703-4f1b-8c77-c23b4366a467-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.580356 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjgd5\" (UniqueName: \"kubernetes.io/projected/487d1c97-b703-4f1b-8c77-c23b4366a467-kube-api-access-zjgd5\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.580365 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af462049-61c3-4da5-aeb0-0311404c4741-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.581743 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caee1439-b7bb-456e-982f-1c3c3cdb51c3-utilities" (OuterVolumeSpecName: "utilities") pod "caee1439-b7bb-456e-982f-1c3c3cdb51c3" (UID: 
"caee1439-b7bb-456e-982f-1c3c3cdb51c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.584366 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487d1c97-b703-4f1b-8c77-c23b4366a467-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "487d1c97-b703-4f1b-8c77-c23b4366a467" (UID: "487d1c97-b703-4f1b-8c77-c23b4366a467"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.584404 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caee1439-b7bb-456e-982f-1c3c3cdb51c3-kube-api-access-56d8b" (OuterVolumeSpecName: "kube-api-access-56d8b") pod "caee1439-b7bb-456e-982f-1c3c3cdb51c3" (UID: "caee1439-b7bb-456e-982f-1c3c3cdb51c3"). InnerVolumeSpecName "kube-api-access-56d8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.585414 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af462049-61c3-4da5-aeb0-0311404c4741-kube-api-access-8jcxq" (OuterVolumeSpecName: "kube-api-access-8jcxq") pod "af462049-61c3-4da5-aeb0-0311404c4741" (UID: "af462049-61c3-4da5-aeb0-0311404c4741"). InnerVolumeSpecName "kube-api-access-8jcxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.616913 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caee1439-b7bb-456e-982f-1c3c3cdb51c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "caee1439-b7bb-456e-982f-1c3c3cdb51c3" (UID: "caee1439-b7bb-456e-982f-1c3c3cdb51c3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.631455 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.681755 4756 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/487d1c97-b703-4f1b-8c77-c23b4366a467-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.681797 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caee1439-b7bb-456e-982f-1c3c3cdb51c3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.681809 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56d8b\" (UniqueName: \"kubernetes.io/projected/caee1439-b7bb-456e-982f-1c3c3cdb51c3-kube-api-access-56d8b\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.681822 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caee1439-b7bb-456e-982f-1c3c3cdb51c3-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.681835 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jcxq\" (UniqueName: \"kubernetes.io/projected/af462049-61c3-4da5-aeb0-0311404c4741-kube-api-access-8jcxq\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.723782 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45k2m"] Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.724439 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/af462049-61c3-4da5-aeb0-0311404c4741-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af462049-61c3-4da5-aeb0-0311404c4741" (UID: "af462049-61c3-4da5-aeb0-0311404c4741"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.782860 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48602255-9809-498e-9c4a-6053ba5ff591-utilities\") pod \"48602255-9809-498e-9c4a-6053ba5ff591\" (UID: \"48602255-9809-498e-9c4a-6053ba5ff591\") " Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.782910 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb2h9\" (UniqueName: \"kubernetes.io/projected/48602255-9809-498e-9c4a-6053ba5ff591-kube-api-access-zb2h9\") pod \"48602255-9809-498e-9c4a-6053ba5ff591\" (UID: \"48602255-9809-498e-9c4a-6053ba5ff591\") " Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.782931 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48602255-9809-498e-9c4a-6053ba5ff591-catalog-content\") pod \"48602255-9809-498e-9c4a-6053ba5ff591\" (UID: \"48602255-9809-498e-9c4a-6053ba5ff591\") " Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.783253 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af462049-61c3-4da5-aeb0-0311404c4741-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.784447 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48602255-9809-498e-9c4a-6053ba5ff591-utilities" (OuterVolumeSpecName: "utilities") pod "48602255-9809-498e-9c4a-6053ba5ff591" (UID: "48602255-9809-498e-9c4a-6053ba5ff591"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.787331 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48602255-9809-498e-9c4a-6053ba5ff591-kube-api-access-zb2h9" (OuterVolumeSpecName: "kube-api-access-zb2h9") pod "48602255-9809-498e-9c4a-6053ba5ff591" (UID: "48602255-9809-498e-9c4a-6053ba5ff591"). InnerVolumeSpecName "kube-api-access-zb2h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.830600 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48602255-9809-498e-9c4a-6053ba5ff591-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48602255-9809-498e-9c4a-6053ba5ff591" (UID: "48602255-9809-498e-9c4a-6053ba5ff591"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.885220 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48602255-9809-498e-9c4a-6053ba5ff591-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.885258 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb2h9\" (UniqueName: \"kubernetes.io/projected/48602255-9809-498e-9c4a-6053ba5ff591-kube-api-access-zb2h9\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:20 crc kubenswrapper[4756]: I0318 14:07:20.885669 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48602255-9809-498e-9c4a-6053ba5ff591-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.259568 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hrhl" 
event={"ID":"caee1439-b7bb-456e-982f-1c3c3cdb51c3","Type":"ContainerDied","Data":"cfa3784752dbd4524d3b17b32e4236210fa92fb35e5cfc9795621d27bd778db2"} Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.259837 4756 scope.go:117] "RemoveContainer" containerID="e26f416eeb1167ba699cba6d84e1fedae84f98109849c081a527ee30b31a601c" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.259587 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hrhl" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.261787 4756 generic.go:334] "Generic (PLEG): container finished" podID="48602255-9809-498e-9c4a-6053ba5ff591" containerID="510acfd7467e9c19579eba546928e683416362c816d3015cf2462f9f4c10361a" exitCode=0 Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.261851 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lkr5r" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.261869 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkr5r" event={"ID":"48602255-9809-498e-9c4a-6053ba5ff591","Type":"ContainerDied","Data":"510acfd7467e9c19579eba546928e683416362c816d3015cf2462f9f4c10361a"} Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.261905 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkr5r" event={"ID":"48602255-9809-498e-9c4a-6053ba5ff591","Type":"ContainerDied","Data":"2fe1715780a79b3516eaae22ae842cab92e5738b865a1b523a5d20ba2fbc93cf"} Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.264223 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" event={"ID":"df050fc3-f811-40d6-a005-b9dc7062fdf5","Type":"ContainerStarted","Data":"e37e650e6bc3bf8f1d1b883dd73f3222c2cdf0c416b57c4cd94117bb7af580c6"} Mar 18 14:07:21 crc kubenswrapper[4756]: 
I0318 14:07:21.264364 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" event={"ID":"df050fc3-f811-40d6-a005-b9dc7062fdf5","Type":"ContainerStarted","Data":"03026aebc33901f4b1c53329e4f210170515ac74f610af0374d08c173e6a952f"} Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.264532 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.266346 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.266346 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4ckd5" event={"ID":"487d1c97-b703-4f1b-8c77-c23b4366a467","Type":"ContainerDied","Data":"7bda1623aa31ed92a9284992313e5eedb4347d81950af29a664bcc7c08f568d9"} Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.268612 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs6l5" event={"ID":"af462049-61c3-4da5-aeb0-0311404c4741","Type":"ContainerDied","Data":"d5029652368aa8e800239e9ab67e0977bd39fe8cd70d1d8ae729a230276c88ce"} Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.268714 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.268719 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gs6l5" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.281180 4756 scope.go:117] "RemoveContainer" containerID="9042f3a5c3deb9daa509093574c1c70fbb32e9ebd8f3c973883b1160e76ae43c" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.285074 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-45k2m" podStartSLOduration=2.285053122 podStartE2EDuration="2.285053122s" podCreationTimestamp="2026-03-18 14:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:07:21.281358213 +0000 UTC m=+442.595776188" watchObservedRunningTime="2026-03-18 14:07:21.285053122 +0000 UTC m=+442.599471097" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.305500 4756 scope.go:117] "RemoveContainer" containerID="1d71cf9b79bb64fb6ef9118e775f87a02aa070d6dd8aa6c974cd903eaed6c66e" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.334414 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb3189f-716d-4fef-b885-3a031a60d981" path="/var/lib/kubelet/pods/7bb3189f-716d-4fef-b885-3a031a60d981/volumes" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.335093 4756 scope.go:117] "RemoveContainer" containerID="510acfd7467e9c19579eba546928e683416362c816d3015cf2462f9f4c10361a" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.335152 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hrhl"] Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.335180 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hrhl"] Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.363327 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4ckd5"] Mar 18 14:07:21 crc 
kubenswrapper[4756]: I0318 14:07:21.372469 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4ckd5"] Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.379472 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lkr5r"] Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.379711 4756 scope.go:117] "RemoveContainer" containerID="343794b8ee23454efd8544022f131672c0c1d1488a888229b905d6d4fc63200c" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.389738 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lkr5r"] Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.393330 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gs6l5"] Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.396001 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gs6l5"] Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.398188 4756 scope.go:117] "RemoveContainer" containerID="e3720792b8226fd684f2d032bcb5ba026eeb41406095bfe5fbdb8e3ca7ff5386" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.416012 4756 scope.go:117] "RemoveContainer" containerID="510acfd7467e9c19579eba546928e683416362c816d3015cf2462f9f4c10361a" Mar 18 14:07:21 crc kubenswrapper[4756]: E0318 14:07:21.416953 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"510acfd7467e9c19579eba546928e683416362c816d3015cf2462f9f4c10361a\": container with ID starting with 510acfd7467e9c19579eba546928e683416362c816d3015cf2462f9f4c10361a not found: ID does not exist" containerID="510acfd7467e9c19579eba546928e683416362c816d3015cf2462f9f4c10361a" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.417027 4756 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"510acfd7467e9c19579eba546928e683416362c816d3015cf2462f9f4c10361a"} err="failed to get container status \"510acfd7467e9c19579eba546928e683416362c816d3015cf2462f9f4c10361a\": rpc error: code = NotFound desc = could not find container \"510acfd7467e9c19579eba546928e683416362c816d3015cf2462f9f4c10361a\": container with ID starting with 510acfd7467e9c19579eba546928e683416362c816d3015cf2462f9f4c10361a not found: ID does not exist" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.417077 4756 scope.go:117] "RemoveContainer" containerID="343794b8ee23454efd8544022f131672c0c1d1488a888229b905d6d4fc63200c" Mar 18 14:07:21 crc kubenswrapper[4756]: E0318 14:07:21.417556 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"343794b8ee23454efd8544022f131672c0c1d1488a888229b905d6d4fc63200c\": container with ID starting with 343794b8ee23454efd8544022f131672c0c1d1488a888229b905d6d4fc63200c not found: ID does not exist" containerID="343794b8ee23454efd8544022f131672c0c1d1488a888229b905d6d4fc63200c" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.417581 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"343794b8ee23454efd8544022f131672c0c1d1488a888229b905d6d4fc63200c"} err="failed to get container status \"343794b8ee23454efd8544022f131672c0c1d1488a888229b905d6d4fc63200c\": rpc error: code = NotFound desc = could not find container \"343794b8ee23454efd8544022f131672c0c1d1488a888229b905d6d4fc63200c\": container with ID starting with 343794b8ee23454efd8544022f131672c0c1d1488a888229b905d6d4fc63200c not found: ID does not exist" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.417594 4756 scope.go:117] "RemoveContainer" containerID="e3720792b8226fd684f2d032bcb5ba026eeb41406095bfe5fbdb8e3ca7ff5386" Mar 18 14:07:21 crc kubenswrapper[4756]: E0318 14:07:21.417959 4756 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e3720792b8226fd684f2d032bcb5ba026eeb41406095bfe5fbdb8e3ca7ff5386\": container with ID starting with e3720792b8226fd684f2d032bcb5ba026eeb41406095bfe5fbdb8e3ca7ff5386 not found: ID does not exist" containerID="e3720792b8226fd684f2d032bcb5ba026eeb41406095bfe5fbdb8e3ca7ff5386" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.418007 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3720792b8226fd684f2d032bcb5ba026eeb41406095bfe5fbdb8e3ca7ff5386"} err="failed to get container status \"e3720792b8226fd684f2d032bcb5ba026eeb41406095bfe5fbdb8e3ca7ff5386\": rpc error: code = NotFound desc = could not find container \"e3720792b8226fd684f2d032bcb5ba026eeb41406095bfe5fbdb8e3ca7ff5386\": container with ID starting with e3720792b8226fd684f2d032bcb5ba026eeb41406095bfe5fbdb8e3ca7ff5386 not found: ID does not exist" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.418031 4756 scope.go:117] "RemoveContainer" containerID="a8c7eca6a9bb3103aa3afdd50a9beeb16b83bc73ee47336dc20fa3c42bd3d4cd" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.430547 4756 scope.go:117] "RemoveContainer" containerID="0eca24fe5bac3c631ba88bd97407443f30d61316ccb50ad843025b68bf12b372" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.439825 4756 scope.go:117] "RemoveContainer" containerID="62108441e5f5afcb75d9fb7396ed2e8fe0142b603a6e187357f391872fc6e3a2" Mar 18 14:07:21 crc kubenswrapper[4756]: I0318 14:07:21.452912 4756 scope.go:117] "RemoveContainer" containerID="d00bc90cc9e2d8f822d27b8a60b1ff0eba4838cd63baef99a62c6e3551a382dd" Mar 18 14:07:23 crc kubenswrapper[4756]: I0318 14:07:23.321309 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48602255-9809-498e-9c4a-6053ba5ff591" path="/var/lib/kubelet/pods/48602255-9809-498e-9c4a-6053ba5ff591/volumes" Mar 18 14:07:23 crc kubenswrapper[4756]: I0318 14:07:23.321914 4756 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="487d1c97-b703-4f1b-8c77-c23b4366a467" path="/var/lib/kubelet/pods/487d1c97-b703-4f1b-8c77-c23b4366a467/volumes" Mar 18 14:07:23 crc kubenswrapper[4756]: I0318 14:07:23.322372 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af462049-61c3-4da5-aeb0-0311404c4741" path="/var/lib/kubelet/pods/af462049-61c3-4da5-aeb0-0311404c4741/volumes" Mar 18 14:07:23 crc kubenswrapper[4756]: I0318 14:07:23.323407 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caee1439-b7bb-456e-982f-1c3c3cdb51c3" path="/var/lib/kubelet/pods/caee1439-b7bb-456e-982f-1c3c3cdb51c3/volumes" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.922931 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4r2hb"] Mar 18 14:07:28 crc kubenswrapper[4756]: E0318 14:07:28.924157 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb3189f-716d-4fef-b885-3a031a60d981" containerName="extract-utilities" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.924190 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb3189f-716d-4fef-b885-3a031a60d981" containerName="extract-utilities" Mar 18 14:07:28 crc kubenswrapper[4756]: E0318 14:07:28.924213 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af462049-61c3-4da5-aeb0-0311404c4741" containerName="extract-content" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.924229 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="af462049-61c3-4da5-aeb0-0311404c4741" containerName="extract-content" Mar 18 14:07:28 crc kubenswrapper[4756]: E0318 14:07:28.924255 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caee1439-b7bb-456e-982f-1c3c3cdb51c3" containerName="extract-content" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.924272 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="caee1439-b7bb-456e-982f-1c3c3cdb51c3" 
containerName="extract-content" Mar 18 14:07:28 crc kubenswrapper[4756]: E0318 14:07:28.924299 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af462049-61c3-4da5-aeb0-0311404c4741" containerName="extract-utilities" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.924315 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="af462049-61c3-4da5-aeb0-0311404c4741" containerName="extract-utilities" Mar 18 14:07:28 crc kubenswrapper[4756]: E0318 14:07:28.924336 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caee1439-b7bb-456e-982f-1c3c3cdb51c3" containerName="extract-utilities" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.924351 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="caee1439-b7bb-456e-982f-1c3c3cdb51c3" containerName="extract-utilities" Mar 18 14:07:28 crc kubenswrapper[4756]: E0318 14:07:28.924376 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48602255-9809-498e-9c4a-6053ba5ff591" containerName="extract-utilities" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.924392 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="48602255-9809-498e-9c4a-6053ba5ff591" containerName="extract-utilities" Mar 18 14:07:28 crc kubenswrapper[4756]: E0318 14:07:28.924415 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af462049-61c3-4da5-aeb0-0311404c4741" containerName="registry-server" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.924430 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="af462049-61c3-4da5-aeb0-0311404c4741" containerName="registry-server" Mar 18 14:07:28 crc kubenswrapper[4756]: E0318 14:07:28.924453 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48602255-9809-498e-9c4a-6053ba5ff591" containerName="extract-content" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.924468 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="48602255-9809-498e-9c4a-6053ba5ff591" 
containerName="extract-content" Mar 18 14:07:28 crc kubenswrapper[4756]: E0318 14:07:28.924493 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb3189f-716d-4fef-b885-3a031a60d981" containerName="extract-content" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.924511 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb3189f-716d-4fef-b885-3a031a60d981" containerName="extract-content" Mar 18 14:07:28 crc kubenswrapper[4756]: E0318 14:07:28.924530 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48602255-9809-498e-9c4a-6053ba5ff591" containerName="registry-server" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.924545 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="48602255-9809-498e-9c4a-6053ba5ff591" containerName="registry-server" Mar 18 14:07:28 crc kubenswrapper[4756]: E0318 14:07:28.924571 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487d1c97-b703-4f1b-8c77-c23b4366a467" containerName="marketplace-operator" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.924587 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="487d1c97-b703-4f1b-8c77-c23b4366a467" containerName="marketplace-operator" Mar 18 14:07:28 crc kubenswrapper[4756]: E0318 14:07:28.924613 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caee1439-b7bb-456e-982f-1c3c3cdb51c3" containerName="registry-server" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.924631 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="caee1439-b7bb-456e-982f-1c3c3cdb51c3" containerName="registry-server" Mar 18 14:07:28 crc kubenswrapper[4756]: E0318 14:07:28.924652 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb3189f-716d-4fef-b885-3a031a60d981" containerName="registry-server" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.924667 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb3189f-716d-4fef-b885-3a031a60d981" 
containerName="registry-server" Mar 18 14:07:28 crc kubenswrapper[4756]: E0318 14:07:28.924688 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487d1c97-b703-4f1b-8c77-c23b4366a467" containerName="marketplace-operator" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.924706 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="487d1c97-b703-4f1b-8c77-c23b4366a467" containerName="marketplace-operator" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.924993 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="af462049-61c3-4da5-aeb0-0311404c4741" containerName="registry-server" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.928456 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="48602255-9809-498e-9c4a-6053ba5ff591" containerName="registry-server" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.928501 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="487d1c97-b703-4f1b-8c77-c23b4366a467" containerName="marketplace-operator" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.928522 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bb3189f-716d-4fef-b885-3a031a60d981" containerName="registry-server" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.928548 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="caee1439-b7bb-456e-982f-1c3c3cdb51c3" containerName="registry-server" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.928993 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="487d1c97-b703-4f1b-8c77-c23b4366a467" containerName="marketplace-operator" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.930249 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4r2hb" Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.933621 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4r2hb"] Mar 18 14:07:28 crc kubenswrapper[4756]: I0318 14:07:28.934001 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 14:07:29 crc kubenswrapper[4756]: I0318 14:07:29.002081 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72438dca-b50b-4607-a61b-a6935f1b296a-utilities\") pod \"certified-operators-4r2hb\" (UID: \"72438dca-b50b-4607-a61b-a6935f1b296a\") " pod="openshift-marketplace/certified-operators-4r2hb" Mar 18 14:07:29 crc kubenswrapper[4756]: I0318 14:07:29.002183 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72438dca-b50b-4607-a61b-a6935f1b296a-catalog-content\") pod \"certified-operators-4r2hb\" (UID: \"72438dca-b50b-4607-a61b-a6935f1b296a\") " pod="openshift-marketplace/certified-operators-4r2hb" Mar 18 14:07:29 crc kubenswrapper[4756]: I0318 14:07:29.002246 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz7km\" (UniqueName: \"kubernetes.io/projected/72438dca-b50b-4607-a61b-a6935f1b296a-kube-api-access-hz7km\") pod \"certified-operators-4r2hb\" (UID: \"72438dca-b50b-4607-a61b-a6935f1b296a\") " pod="openshift-marketplace/certified-operators-4r2hb" Mar 18 14:07:29 crc kubenswrapper[4756]: I0318 14:07:29.102864 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72438dca-b50b-4607-a61b-a6935f1b296a-utilities\") pod \"certified-operators-4r2hb\" (UID: 
\"72438dca-b50b-4607-a61b-a6935f1b296a\") " pod="openshift-marketplace/certified-operators-4r2hb" Mar 18 14:07:29 crc kubenswrapper[4756]: I0318 14:07:29.102962 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72438dca-b50b-4607-a61b-a6935f1b296a-catalog-content\") pod \"certified-operators-4r2hb\" (UID: \"72438dca-b50b-4607-a61b-a6935f1b296a\") " pod="openshift-marketplace/certified-operators-4r2hb" Mar 18 14:07:29 crc kubenswrapper[4756]: I0318 14:07:29.103043 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz7km\" (UniqueName: \"kubernetes.io/projected/72438dca-b50b-4607-a61b-a6935f1b296a-kube-api-access-hz7km\") pod \"certified-operators-4r2hb\" (UID: \"72438dca-b50b-4607-a61b-a6935f1b296a\") " pod="openshift-marketplace/certified-operators-4r2hb" Mar 18 14:07:29 crc kubenswrapper[4756]: I0318 14:07:29.103946 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72438dca-b50b-4607-a61b-a6935f1b296a-catalog-content\") pod \"certified-operators-4r2hb\" (UID: \"72438dca-b50b-4607-a61b-a6935f1b296a\") " pod="openshift-marketplace/certified-operators-4r2hb" Mar 18 14:07:29 crc kubenswrapper[4756]: I0318 14:07:29.104270 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72438dca-b50b-4607-a61b-a6935f1b296a-utilities\") pod \"certified-operators-4r2hb\" (UID: \"72438dca-b50b-4607-a61b-a6935f1b296a\") " pod="openshift-marketplace/certified-operators-4r2hb" Mar 18 14:07:29 crc kubenswrapper[4756]: I0318 14:07:29.146603 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz7km\" (UniqueName: \"kubernetes.io/projected/72438dca-b50b-4607-a61b-a6935f1b296a-kube-api-access-hz7km\") pod \"certified-operators-4r2hb\" (UID: 
\"72438dca-b50b-4607-a61b-a6935f1b296a\") " pod="openshift-marketplace/certified-operators-4r2hb" Mar 18 14:07:29 crc kubenswrapper[4756]: I0318 14:07:29.261317 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4r2hb" Mar 18 14:07:29 crc kubenswrapper[4756]: I0318 14:07:29.734077 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4r2hb"] Mar 18 14:07:29 crc kubenswrapper[4756]: W0318 14:07:29.742010 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72438dca_b50b_4607_a61b_a6935f1b296a.slice/crio-e83619554a9b72549109c186b749e04390abd944b725a4a63791e4a1ab0fc489 WatchSource:0}: Error finding container e83619554a9b72549109c186b749e04390abd944b725a4a63791e4a1ab0fc489: Status 404 returned error can't find the container with id e83619554a9b72549109c186b749e04390abd944b725a4a63791e4a1ab0fc489 Mar 18 14:07:29 crc kubenswrapper[4756]: I0318 14:07:29.924856 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tbqbv"] Mar 18 14:07:29 crc kubenswrapper[4756]: I0318 14:07:29.928231 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbqbv" Mar 18 14:07:29 crc kubenswrapper[4756]: I0318 14:07:29.931506 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 14:07:29 crc kubenswrapper[4756]: I0318 14:07:29.935003 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbqbv"] Mar 18 14:07:30 crc kubenswrapper[4756]: I0318 14:07:30.016665 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpztc\" (UniqueName: \"kubernetes.io/projected/5f40a331-0e50-4142-86f5-b0a5a3162f3b-kube-api-access-tpztc\") pod \"redhat-marketplace-tbqbv\" (UID: \"5f40a331-0e50-4142-86f5-b0a5a3162f3b\") " pod="openshift-marketplace/redhat-marketplace-tbqbv" Mar 18 14:07:30 crc kubenswrapper[4756]: I0318 14:07:30.016857 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f40a331-0e50-4142-86f5-b0a5a3162f3b-catalog-content\") pod \"redhat-marketplace-tbqbv\" (UID: \"5f40a331-0e50-4142-86f5-b0a5a3162f3b\") " pod="openshift-marketplace/redhat-marketplace-tbqbv" Mar 18 14:07:30 crc kubenswrapper[4756]: I0318 14:07:30.016917 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f40a331-0e50-4142-86f5-b0a5a3162f3b-utilities\") pod \"redhat-marketplace-tbqbv\" (UID: \"5f40a331-0e50-4142-86f5-b0a5a3162f3b\") " pod="openshift-marketplace/redhat-marketplace-tbqbv" Mar 18 14:07:30 crc kubenswrapper[4756]: I0318 14:07:30.117956 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpztc\" (UniqueName: \"kubernetes.io/projected/5f40a331-0e50-4142-86f5-b0a5a3162f3b-kube-api-access-tpztc\") pod \"redhat-marketplace-tbqbv\" (UID: 
\"5f40a331-0e50-4142-86f5-b0a5a3162f3b\") " pod="openshift-marketplace/redhat-marketplace-tbqbv" Mar 18 14:07:30 crc kubenswrapper[4756]: I0318 14:07:30.118093 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f40a331-0e50-4142-86f5-b0a5a3162f3b-catalog-content\") pod \"redhat-marketplace-tbqbv\" (UID: \"5f40a331-0e50-4142-86f5-b0a5a3162f3b\") " pod="openshift-marketplace/redhat-marketplace-tbqbv" Mar 18 14:07:30 crc kubenswrapper[4756]: I0318 14:07:30.118167 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f40a331-0e50-4142-86f5-b0a5a3162f3b-utilities\") pod \"redhat-marketplace-tbqbv\" (UID: \"5f40a331-0e50-4142-86f5-b0a5a3162f3b\") " pod="openshift-marketplace/redhat-marketplace-tbqbv" Mar 18 14:07:30 crc kubenswrapper[4756]: I0318 14:07:30.119339 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f40a331-0e50-4142-86f5-b0a5a3162f3b-utilities\") pod \"redhat-marketplace-tbqbv\" (UID: \"5f40a331-0e50-4142-86f5-b0a5a3162f3b\") " pod="openshift-marketplace/redhat-marketplace-tbqbv" Mar 18 14:07:30 crc kubenswrapper[4756]: I0318 14:07:30.120281 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f40a331-0e50-4142-86f5-b0a5a3162f3b-catalog-content\") pod \"redhat-marketplace-tbqbv\" (UID: \"5f40a331-0e50-4142-86f5-b0a5a3162f3b\") " pod="openshift-marketplace/redhat-marketplace-tbqbv" Mar 18 14:07:30 crc kubenswrapper[4756]: I0318 14:07:30.153290 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpztc\" (UniqueName: \"kubernetes.io/projected/5f40a331-0e50-4142-86f5-b0a5a3162f3b-kube-api-access-tpztc\") pod \"redhat-marketplace-tbqbv\" (UID: \"5f40a331-0e50-4142-86f5-b0a5a3162f3b\") " 
pod="openshift-marketplace/redhat-marketplace-tbqbv" Mar 18 14:07:30 crc kubenswrapper[4756]: I0318 14:07:30.252974 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbqbv" Mar 18 14:07:30 crc kubenswrapper[4756]: I0318 14:07:30.340957 4756 generic.go:334] "Generic (PLEG): container finished" podID="72438dca-b50b-4607-a61b-a6935f1b296a" containerID="6397c270a93f0527faa12c884225b92b00fbe61b0d600df6df99e9a75d18a7c9" exitCode=0 Mar 18 14:07:30 crc kubenswrapper[4756]: I0318 14:07:30.341236 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4r2hb" event={"ID":"72438dca-b50b-4607-a61b-a6935f1b296a","Type":"ContainerDied","Data":"6397c270a93f0527faa12c884225b92b00fbe61b0d600df6df99e9a75d18a7c9"} Mar 18 14:07:30 crc kubenswrapper[4756]: I0318 14:07:30.341373 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4r2hb" event={"ID":"72438dca-b50b-4607-a61b-a6935f1b296a","Type":"ContainerStarted","Data":"e83619554a9b72549109c186b749e04390abd944b725a4a63791e4a1ab0fc489"} Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:30.540350 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbqbv"] Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.314344 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mwdj7"] Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.316345 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mwdj7" Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.323434 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.335049 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mwdj7"] Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.353075 4756 generic.go:334] "Generic (PLEG): container finished" podID="5f40a331-0e50-4142-86f5-b0a5a3162f3b" containerID="d828dd2a4ba050bcf5ed843fa3940ed19b7d3979aae2cd57c5b31e3c456c1fb0" exitCode=0 Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.353265 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbqbv" event={"ID":"5f40a331-0e50-4142-86f5-b0a5a3162f3b","Type":"ContainerDied","Data":"d828dd2a4ba050bcf5ed843fa3940ed19b7d3979aae2cd57c5b31e3c456c1fb0"} Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.353305 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbqbv" event={"ID":"5f40a331-0e50-4142-86f5-b0a5a3162f3b","Type":"ContainerStarted","Data":"885abec0a4cb4f133b34b1300d9adedff187529edd2623b868a174ebc9210899"} Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.355958 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4r2hb" event={"ID":"72438dca-b50b-4607-a61b-a6935f1b296a","Type":"ContainerStarted","Data":"1015789cdcf192571368f914205a1004df338efe203a33e43995e7d184d769a4"} Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.438043 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/998c254a-7789-4cf4-9445-1f6a76068bd0-utilities\") pod \"redhat-operators-mwdj7\" (UID: 
\"998c254a-7789-4cf4-9445-1f6a76068bd0\") " pod="openshift-marketplace/redhat-operators-mwdj7" Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.438095 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/998c254a-7789-4cf4-9445-1f6a76068bd0-catalog-content\") pod \"redhat-operators-mwdj7\" (UID: \"998c254a-7789-4cf4-9445-1f6a76068bd0\") " pod="openshift-marketplace/redhat-operators-mwdj7" Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.438232 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngv4v\" (UniqueName: \"kubernetes.io/projected/998c254a-7789-4cf4-9445-1f6a76068bd0-kube-api-access-ngv4v\") pod \"redhat-operators-mwdj7\" (UID: \"998c254a-7789-4cf4-9445-1f6a76068bd0\") " pod="openshift-marketplace/redhat-operators-mwdj7" Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.538868 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngv4v\" (UniqueName: \"kubernetes.io/projected/998c254a-7789-4cf4-9445-1f6a76068bd0-kube-api-access-ngv4v\") pod \"redhat-operators-mwdj7\" (UID: \"998c254a-7789-4cf4-9445-1f6a76068bd0\") " pod="openshift-marketplace/redhat-operators-mwdj7" Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.538938 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/998c254a-7789-4cf4-9445-1f6a76068bd0-utilities\") pod \"redhat-operators-mwdj7\" (UID: \"998c254a-7789-4cf4-9445-1f6a76068bd0\") " pod="openshift-marketplace/redhat-operators-mwdj7" Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.538958 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/998c254a-7789-4cf4-9445-1f6a76068bd0-catalog-content\") pod \"redhat-operators-mwdj7\" 
(UID: \"998c254a-7789-4cf4-9445-1f6a76068bd0\") " pod="openshift-marketplace/redhat-operators-mwdj7" Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.539409 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/998c254a-7789-4cf4-9445-1f6a76068bd0-catalog-content\") pod \"redhat-operators-mwdj7\" (UID: \"998c254a-7789-4cf4-9445-1f6a76068bd0\") " pod="openshift-marketplace/redhat-operators-mwdj7" Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.539727 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/998c254a-7789-4cf4-9445-1f6a76068bd0-utilities\") pod \"redhat-operators-mwdj7\" (UID: \"998c254a-7789-4cf4-9445-1f6a76068bd0\") " pod="openshift-marketplace/redhat-operators-mwdj7" Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.556368 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngv4v\" (UniqueName: \"kubernetes.io/projected/998c254a-7789-4cf4-9445-1f6a76068bd0-kube-api-access-ngv4v\") pod \"redhat-operators-mwdj7\" (UID: \"998c254a-7789-4cf4-9445-1f6a76068bd0\") " pod="openshift-marketplace/redhat-operators-mwdj7" Mar 18 14:07:31 crc kubenswrapper[4756]: I0318 14:07:31.707952 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mwdj7" Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.136658 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mwdj7"] Mar 18 14:07:32 crc kubenswrapper[4756]: W0318 14:07:32.145499 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod998c254a_7789_4cf4_9445_1f6a76068bd0.slice/crio-6e48c4c748ef84b19a41d6b328f9f956c13cf67f9a800dfb885e77a5e99a5675 WatchSource:0}: Error finding container 6e48c4c748ef84b19a41d6b328f9f956c13cf67f9a800dfb885e77a5e99a5675: Status 404 returned error can't find the container with id 6e48c4c748ef84b19a41d6b328f9f956c13cf67f9a800dfb885e77a5e99a5675 Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.302930 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pb94g"] Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.303847 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.309797 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.323061 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pb94g"] Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.362867 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25847598-c3f0-419a-b422-6b35f9f71311-utilities\") pod \"community-operators-pb94g\" (UID: \"25847598-c3f0-419a-b422-6b35f9f71311\") " pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.362947 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25847598-c3f0-419a-b422-6b35f9f71311-catalog-content\") pod \"community-operators-pb94g\" (UID: \"25847598-c3f0-419a-b422-6b35f9f71311\") " pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.362979 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nffpw\" (UniqueName: \"kubernetes.io/projected/25847598-c3f0-419a-b422-6b35f9f71311-kube-api-access-nffpw\") pod \"community-operators-pb94g\" (UID: \"25847598-c3f0-419a-b422-6b35f9f71311\") " pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.365729 4756 generic.go:334] "Generic (PLEG): container finished" podID="998c254a-7789-4cf4-9445-1f6a76068bd0" containerID="01c9135d06813a0410e70fcfdcc612d3c6f91acff82fd8663183f881ef83825b" exitCode=0 Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 
14:07:32.365814 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwdj7" event={"ID":"998c254a-7789-4cf4-9445-1f6a76068bd0","Type":"ContainerDied","Data":"01c9135d06813a0410e70fcfdcc612d3c6f91acff82fd8663183f881ef83825b"} Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.365851 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwdj7" event={"ID":"998c254a-7789-4cf4-9445-1f6a76068bd0","Type":"ContainerStarted","Data":"6e48c4c748ef84b19a41d6b328f9f956c13cf67f9a800dfb885e77a5e99a5675"} Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.369299 4756 generic.go:334] "Generic (PLEG): container finished" podID="72438dca-b50b-4607-a61b-a6935f1b296a" containerID="1015789cdcf192571368f914205a1004df338efe203a33e43995e7d184d769a4" exitCode=0 Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.369333 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4r2hb" event={"ID":"72438dca-b50b-4607-a61b-a6935f1b296a","Type":"ContainerDied","Data":"1015789cdcf192571368f914205a1004df338efe203a33e43995e7d184d769a4"} Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.369353 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4r2hb" event={"ID":"72438dca-b50b-4607-a61b-a6935f1b296a","Type":"ContainerStarted","Data":"e791d33f65be8986ff8921f182c73422a99a354e583da8222c7292c8a0e6f678"} Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.464273 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25847598-c3f0-419a-b422-6b35f9f71311-catalog-content\") pod \"community-operators-pb94g\" (UID: \"25847598-c3f0-419a-b422-6b35f9f71311\") " pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.464321 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nffpw\" (UniqueName: \"kubernetes.io/projected/25847598-c3f0-419a-b422-6b35f9f71311-kube-api-access-nffpw\") pod \"community-operators-pb94g\" (UID: \"25847598-c3f0-419a-b422-6b35f9f71311\") " pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.464663 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25847598-c3f0-419a-b422-6b35f9f71311-utilities\") pod \"community-operators-pb94g\" (UID: \"25847598-c3f0-419a-b422-6b35f9f71311\") " pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.464903 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25847598-c3f0-419a-b422-6b35f9f71311-utilities\") pod \"community-operators-pb94g\" (UID: \"25847598-c3f0-419a-b422-6b35f9f71311\") " pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.464973 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25847598-c3f0-419a-b422-6b35f9f71311-catalog-content\") pod \"community-operators-pb94g\" (UID: \"25847598-c3f0-419a-b422-6b35f9f71311\") " pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.493963 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nffpw\" (UniqueName: \"kubernetes.io/projected/25847598-c3f0-419a-b422-6b35f9f71311-kube-api-access-nffpw\") pod \"community-operators-pb94g\" (UID: \"25847598-c3f0-419a-b422-6b35f9f71311\") " pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.623732 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.852096 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4r2hb" podStartSLOduration=3.252397532 podStartE2EDuration="4.852080378s" podCreationTimestamp="2026-03-18 14:07:28 +0000 UTC" firstStartedPulling="2026-03-18 14:07:30.343864989 +0000 UTC m=+451.658283004" lastFinishedPulling="2026-03-18 14:07:31.943547875 +0000 UTC m=+453.257965850" observedRunningTime="2026-03-18 14:07:32.401689515 +0000 UTC m=+453.716107500" watchObservedRunningTime="2026-03-18 14:07:32.852080378 +0000 UTC m=+454.166498353" Mar 18 14:07:32 crc kubenswrapper[4756]: I0318 14:07:32.853985 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pb94g"] Mar 18 14:07:32 crc kubenswrapper[4756]: W0318 14:07:32.859369 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25847598_c3f0_419a_b422_6b35f9f71311.slice/crio-acda8be0354531366e2f4e336cec135b6c0eaaf411bf654ba0e83abeb13e7cab WatchSource:0}: Error finding container acda8be0354531366e2f4e336cec135b6c0eaaf411bf654ba0e83abeb13e7cab: Status 404 returned error can't find the container with id acda8be0354531366e2f4e336cec135b6c0eaaf411bf654ba0e83abeb13e7cab Mar 18 14:07:33 crc kubenswrapper[4756]: I0318 14:07:33.378238 4756 generic.go:334] "Generic (PLEG): container finished" podID="25847598-c3f0-419a-b422-6b35f9f71311" containerID="f2685483d1a07d055837cfb91685fd601760475b196eb8fc7987ea6e91403b42" exitCode=0 Mar 18 14:07:33 crc kubenswrapper[4756]: I0318 14:07:33.378342 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb94g" event={"ID":"25847598-c3f0-419a-b422-6b35f9f71311","Type":"ContainerDied","Data":"f2685483d1a07d055837cfb91685fd601760475b196eb8fc7987ea6e91403b42"} 
Mar 18 14:07:33 crc kubenswrapper[4756]: I0318 14:07:33.378381 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb94g" event={"ID":"25847598-c3f0-419a-b422-6b35f9f71311","Type":"ContainerStarted","Data":"acda8be0354531366e2f4e336cec135b6c0eaaf411bf654ba0e83abeb13e7cab"} Mar 18 14:07:33 crc kubenswrapper[4756]: I0318 14:07:33.382891 4756 generic.go:334] "Generic (PLEG): container finished" podID="5f40a331-0e50-4142-86f5-b0a5a3162f3b" containerID="3c6d2120f410383451449fffaee6cd33a5e57771aa935cd4f786c2c27a3a940c" exitCode=0 Mar 18 14:07:33 crc kubenswrapper[4756]: I0318 14:07:33.384212 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbqbv" event={"ID":"5f40a331-0e50-4142-86f5-b0a5a3162f3b","Type":"ContainerDied","Data":"3c6d2120f410383451449fffaee6cd33a5e57771aa935cd4f786c2c27a3a940c"} Mar 18 14:07:34 crc kubenswrapper[4756]: I0318 14:07:34.393645 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbqbv" event={"ID":"5f40a331-0e50-4142-86f5-b0a5a3162f3b","Type":"ContainerStarted","Data":"3a410260062583368f4ec4402765f7f8e1c1dadf31fbe39c2505bc1ad340a9b2"} Mar 18 14:07:34 crc kubenswrapper[4756]: I0318 14:07:34.400227 4756 generic.go:334] "Generic (PLEG): container finished" podID="998c254a-7789-4cf4-9445-1f6a76068bd0" containerID="27797b6b42dbee4879e8090fa87b1a02e08e954b43d3d7e8f182515bc4495444" exitCode=0 Mar 18 14:07:34 crc kubenswrapper[4756]: I0318 14:07:34.400276 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwdj7" event={"ID":"998c254a-7789-4cf4-9445-1f6a76068bd0","Type":"ContainerDied","Data":"27797b6b42dbee4879e8090fa87b1a02e08e954b43d3d7e8f182515bc4495444"} Mar 18 14:07:34 crc kubenswrapper[4756]: I0318 14:07:34.402782 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb94g" 
event={"ID":"25847598-c3f0-419a-b422-6b35f9f71311","Type":"ContainerStarted","Data":"bdc7db3f24bab4604e6a793cf5b8300af8e4b5d2e9f3779c5c86a2e3306f159a"} Mar 18 14:07:34 crc kubenswrapper[4756]: I0318 14:07:34.426351 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tbqbv" podStartSLOduration=2.934704345 podStartE2EDuration="5.426323877s" podCreationTimestamp="2026-03-18 14:07:29 +0000 UTC" firstStartedPulling="2026-03-18 14:07:31.354882594 +0000 UTC m=+452.669300579" lastFinishedPulling="2026-03-18 14:07:33.846502136 +0000 UTC m=+455.160920111" observedRunningTime="2026-03-18 14:07:34.416891964 +0000 UTC m=+455.731309969" watchObservedRunningTime="2026-03-18 14:07:34.426323877 +0000 UTC m=+455.740741892" Mar 18 14:07:35 crc kubenswrapper[4756]: I0318 14:07:35.410579 4756 generic.go:334] "Generic (PLEG): container finished" podID="25847598-c3f0-419a-b422-6b35f9f71311" containerID="bdc7db3f24bab4604e6a793cf5b8300af8e4b5d2e9f3779c5c86a2e3306f159a" exitCode=0 Mar 18 14:07:35 crc kubenswrapper[4756]: I0318 14:07:35.410629 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb94g" event={"ID":"25847598-c3f0-419a-b422-6b35f9f71311","Type":"ContainerDied","Data":"bdc7db3f24bab4604e6a793cf5b8300af8e4b5d2e9f3779c5c86a2e3306f159a"} Mar 18 14:07:35 crc kubenswrapper[4756]: I0318 14:07:35.413461 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwdj7" event={"ID":"998c254a-7789-4cf4-9445-1f6a76068bd0","Type":"ContainerStarted","Data":"5b008252795efff961a7f0890bd99e201f1441a55a028a1961b693c6117afa13"} Mar 18 14:07:35 crc kubenswrapper[4756]: I0318 14:07:35.449379 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mwdj7" podStartSLOduration=1.91417471 podStartE2EDuration="4.449356887s" podCreationTimestamp="2026-03-18 14:07:31 +0000 UTC" 
firstStartedPulling="2026-03-18 14:07:32.367752691 +0000 UTC m=+453.682170666" lastFinishedPulling="2026-03-18 14:07:34.902934858 +0000 UTC m=+456.217352843" observedRunningTime="2026-03-18 14:07:35.446055271 +0000 UTC m=+456.760473246" watchObservedRunningTime="2026-03-18 14:07:35.449356887 +0000 UTC m=+456.763774872" Mar 18 14:07:36 crc kubenswrapper[4756]: I0318 14:07:36.429825 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb94g" event={"ID":"25847598-c3f0-419a-b422-6b35f9f71311","Type":"ContainerStarted","Data":"e6730ec25cdadc0b0f00e8b0c212a52914b1f2c2fafa1eaee80a7d915faa080c"} Mar 18 14:07:36 crc kubenswrapper[4756]: I0318 14:07:36.915854 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:07:36 crc kubenswrapper[4756]: I0318 14:07:36.915927 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:07:37 crc kubenswrapper[4756]: I0318 14:07:37.730229 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" podUID="a15d24cf-4182-44bb-9d60-33649137cc83" containerName="registry" containerID="cri-o://e24d33b52f39665a23cc6883f6a87c34b66ef6825cf95aca6a40e45f4b59bca2" gracePeriod=30 Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.063080 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.088341 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pb94g" podStartSLOduration=3.6695880069999998 podStartE2EDuration="6.088308964s" podCreationTimestamp="2026-03-18 14:07:32 +0000 UTC" firstStartedPulling="2026-03-18 14:07:33.381206353 +0000 UTC m=+454.695624328" lastFinishedPulling="2026-03-18 14:07:35.79992731 +0000 UTC m=+457.114345285" observedRunningTime="2026-03-18 14:07:36.447273982 +0000 UTC m=+457.761691967" watchObservedRunningTime="2026-03-18 14:07:38.088308964 +0000 UTC m=+459.402726949" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.149251 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7nl8\" (UniqueName: \"kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-kube-api-access-w7nl8\") pod \"a15d24cf-4182-44bb-9d60-33649137cc83\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.149338 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a15d24cf-4182-44bb-9d60-33649137cc83-ca-trust-extracted\") pod \"a15d24cf-4182-44bb-9d60-33649137cc83\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.149382 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-registry-tls\") pod \"a15d24cf-4182-44bb-9d60-33649137cc83\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.149431 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a15d24cf-4182-44bb-9d60-33649137cc83-trusted-ca\") pod \"a15d24cf-4182-44bb-9d60-33649137cc83\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.150535 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a15d24cf-4182-44bb-9d60-33649137cc83-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a15d24cf-4182-44bb-9d60-33649137cc83" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.150665 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a15d24cf-4182-44bb-9d60-33649137cc83\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.150693 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a15d24cf-4182-44bb-9d60-33649137cc83-registry-certificates\") pod \"a15d24cf-4182-44bb-9d60-33649137cc83\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.150750 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a15d24cf-4182-44bb-9d60-33649137cc83-installation-pull-secrets\") pod \"a15d24cf-4182-44bb-9d60-33649137cc83\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.151569 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a15d24cf-4182-44bb-9d60-33649137cc83-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod 
"a15d24cf-4182-44bb-9d60-33649137cc83" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.151676 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-bound-sa-token\") pod \"a15d24cf-4182-44bb-9d60-33649137cc83\" (UID: \"a15d24cf-4182-44bb-9d60-33649137cc83\") " Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.151945 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a15d24cf-4182-44bb-9d60-33649137cc83-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.151969 4756 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a15d24cf-4182-44bb-9d60-33649137cc83-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.155470 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-kube-api-access-w7nl8" (OuterVolumeSpecName: "kube-api-access-w7nl8") pod "a15d24cf-4182-44bb-9d60-33649137cc83" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83"). InnerVolumeSpecName "kube-api-access-w7nl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.157391 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15d24cf-4182-44bb-9d60-33649137cc83-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a15d24cf-4182-44bb-9d60-33649137cc83" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.157530 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a15d24cf-4182-44bb-9d60-33649137cc83" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.157518 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a15d24cf-4182-44bb-9d60-33649137cc83" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.166709 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a15d24cf-4182-44bb-9d60-33649137cc83" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.166714 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a15d24cf-4182-44bb-9d60-33649137cc83-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a15d24cf-4182-44bb-9d60-33649137cc83" (UID: "a15d24cf-4182-44bb-9d60-33649137cc83"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.253602 4756 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.253656 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7nl8\" (UniqueName: \"kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-kube-api-access-w7nl8\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.253677 4756 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a15d24cf-4182-44bb-9d60-33649137cc83-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.253695 4756 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a15d24cf-4182-44bb-9d60-33649137cc83-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.253713 4756 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a15d24cf-4182-44bb-9d60-33649137cc83-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.445201 4756 generic.go:334] "Generic (PLEG): container finished" podID="a15d24cf-4182-44bb-9d60-33649137cc83" containerID="e24d33b52f39665a23cc6883f6a87c34b66ef6825cf95aca6a40e45f4b59bca2" exitCode=0 Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.445279 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" 
event={"ID":"a15d24cf-4182-44bb-9d60-33649137cc83","Type":"ContainerDied","Data":"e24d33b52f39665a23cc6883f6a87c34b66ef6825cf95aca6a40e45f4b59bca2"} Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.445353 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" event={"ID":"a15d24cf-4182-44bb-9d60-33649137cc83","Type":"ContainerDied","Data":"eefa0a132bc5d97acb29465466a79a2100c2aebadfc104b20f5ec1adbb109e32"} Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.445305 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9pdb4" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.445399 4756 scope.go:117] "RemoveContainer" containerID="e24d33b52f39665a23cc6883f6a87c34b66ef6825cf95aca6a40e45f4b59bca2" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.474741 4756 scope.go:117] "RemoveContainer" containerID="e24d33b52f39665a23cc6883f6a87c34b66ef6825cf95aca6a40e45f4b59bca2" Mar 18 14:07:38 crc kubenswrapper[4756]: E0318 14:07:38.477321 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e24d33b52f39665a23cc6883f6a87c34b66ef6825cf95aca6a40e45f4b59bca2\": container with ID starting with e24d33b52f39665a23cc6883f6a87c34b66ef6825cf95aca6a40e45f4b59bca2 not found: ID does not exist" containerID="e24d33b52f39665a23cc6883f6a87c34b66ef6825cf95aca6a40e45f4b59bca2" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.477395 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e24d33b52f39665a23cc6883f6a87c34b66ef6825cf95aca6a40e45f4b59bca2"} err="failed to get container status \"e24d33b52f39665a23cc6883f6a87c34b66ef6825cf95aca6a40e45f4b59bca2\": rpc error: code = NotFound desc = could not find container \"e24d33b52f39665a23cc6883f6a87c34b66ef6825cf95aca6a40e45f4b59bca2\": container with ID 
starting with e24d33b52f39665a23cc6883f6a87c34b66ef6825cf95aca6a40e45f4b59bca2 not found: ID does not exist" Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.495800 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9pdb4"] Mar 18 14:07:38 crc kubenswrapper[4756]: I0318 14:07:38.501174 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9pdb4"] Mar 18 14:07:39 crc kubenswrapper[4756]: I0318 14:07:39.261469 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4r2hb" Mar 18 14:07:39 crc kubenswrapper[4756]: I0318 14:07:39.261749 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4r2hb" Mar 18 14:07:39 crc kubenswrapper[4756]: I0318 14:07:39.304969 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4r2hb" Mar 18 14:07:39 crc kubenswrapper[4756]: I0318 14:07:39.329524 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a15d24cf-4182-44bb-9d60-33649137cc83" path="/var/lib/kubelet/pods/a15d24cf-4182-44bb-9d60-33649137cc83/volumes" Mar 18 14:07:39 crc kubenswrapper[4756]: I0318 14:07:39.507232 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4r2hb" Mar 18 14:07:40 crc kubenswrapper[4756]: I0318 14:07:40.253722 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tbqbv" Mar 18 14:07:40 crc kubenswrapper[4756]: I0318 14:07:40.253809 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tbqbv" Mar 18 14:07:40 crc kubenswrapper[4756]: I0318 14:07:40.303505 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-tbqbv" Mar 18 14:07:40 crc kubenswrapper[4756]: I0318 14:07:40.527569 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tbqbv" Mar 18 14:07:41 crc kubenswrapper[4756]: I0318 14:07:41.708608 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mwdj7" Mar 18 14:07:41 crc kubenswrapper[4756]: I0318 14:07:41.709005 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mwdj7" Mar 18 14:07:42 crc kubenswrapper[4756]: I0318 14:07:42.624322 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:07:42 crc kubenswrapper[4756]: I0318 14:07:42.624643 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:07:42 crc kubenswrapper[4756]: I0318 14:07:42.665526 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:07:42 crc kubenswrapper[4756]: I0318 14:07:42.779404 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mwdj7" podUID="998c254a-7789-4cf4-9445-1f6a76068bd0" containerName="registry-server" probeResult="failure" output=< Mar 18 14:07:42 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Mar 18 14:07:42 crc kubenswrapper[4756]: > Mar 18 14:07:43 crc kubenswrapper[4756]: I0318 14:07:43.544557 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:07:51 crc kubenswrapper[4756]: I0318 14:07:51.778592 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mwdj7" Mar 18 14:07:51 
crc kubenswrapper[4756]: I0318 14:07:51.825034 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mwdj7" Mar 18 14:08:00 crc kubenswrapper[4756]: I0318 14:08:00.140225 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564048-nw25j"] Mar 18 14:08:00 crc kubenswrapper[4756]: E0318 14:08:00.140868 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15d24cf-4182-44bb-9d60-33649137cc83" containerName="registry" Mar 18 14:08:00 crc kubenswrapper[4756]: I0318 14:08:00.140882 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15d24cf-4182-44bb-9d60-33649137cc83" containerName="registry" Mar 18 14:08:00 crc kubenswrapper[4756]: I0318 14:08:00.141002 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15d24cf-4182-44bb-9d60-33649137cc83" containerName="registry" Mar 18 14:08:00 crc kubenswrapper[4756]: I0318 14:08:00.141423 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564048-nw25j" Mar 18 14:08:00 crc kubenswrapper[4756]: I0318 14:08:00.143614 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:08:00 crc kubenswrapper[4756]: I0318 14:08:00.144561 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:08:00 crc kubenswrapper[4756]: I0318 14:08:00.146313 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:08:00 crc kubenswrapper[4756]: I0318 14:08:00.154776 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564048-nw25j"] Mar 18 14:08:00 crc kubenswrapper[4756]: I0318 14:08:00.235248 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnlnj\" (UniqueName: \"kubernetes.io/projected/208a151e-414a-426e-9133-9bbcecd3445e-kube-api-access-xnlnj\") pod \"auto-csr-approver-29564048-nw25j\" (UID: \"208a151e-414a-426e-9133-9bbcecd3445e\") " pod="openshift-infra/auto-csr-approver-29564048-nw25j" Mar 18 14:08:00 crc kubenswrapper[4756]: I0318 14:08:00.336156 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnlnj\" (UniqueName: \"kubernetes.io/projected/208a151e-414a-426e-9133-9bbcecd3445e-kube-api-access-xnlnj\") pod \"auto-csr-approver-29564048-nw25j\" (UID: \"208a151e-414a-426e-9133-9bbcecd3445e\") " pod="openshift-infra/auto-csr-approver-29564048-nw25j" Mar 18 14:08:00 crc kubenswrapper[4756]: I0318 14:08:00.360913 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnlnj\" (UniqueName: \"kubernetes.io/projected/208a151e-414a-426e-9133-9bbcecd3445e-kube-api-access-xnlnj\") pod \"auto-csr-approver-29564048-nw25j\" (UID: \"208a151e-414a-426e-9133-9bbcecd3445e\") " 
pod="openshift-infra/auto-csr-approver-29564048-nw25j" Mar 18 14:08:00 crc kubenswrapper[4756]: I0318 14:08:00.459790 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564048-nw25j" Mar 18 14:08:00 crc kubenswrapper[4756]: I0318 14:08:00.697914 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564048-nw25j"] Mar 18 14:08:00 crc kubenswrapper[4756]: W0318 14:08:00.707707 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod208a151e_414a_426e_9133_9bbcecd3445e.slice/crio-f5ae89d974610b91b843defd0090b19286cafe02e83be3841276cbbf10017d66 WatchSource:0}: Error finding container f5ae89d974610b91b843defd0090b19286cafe02e83be3841276cbbf10017d66: Status 404 returned error can't find the container with id f5ae89d974610b91b843defd0090b19286cafe02e83be3841276cbbf10017d66 Mar 18 14:08:01 crc kubenswrapper[4756]: I0318 14:08:01.589730 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564048-nw25j" event={"ID":"208a151e-414a-426e-9133-9bbcecd3445e","Type":"ContainerStarted","Data":"f5ae89d974610b91b843defd0090b19286cafe02e83be3841276cbbf10017d66"} Mar 18 14:08:06 crc kubenswrapper[4756]: I0318 14:08:06.620047 4756 generic.go:334] "Generic (PLEG): container finished" podID="208a151e-414a-426e-9133-9bbcecd3445e" containerID="a8d2fb1e8122ad15752e4e8b6d34e91a4d25bf4f6de9c2f45ee9fb7d9c296ead" exitCode=0 Mar 18 14:08:06 crc kubenswrapper[4756]: I0318 14:08:06.620153 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564048-nw25j" event={"ID":"208a151e-414a-426e-9133-9bbcecd3445e","Type":"ContainerDied","Data":"a8d2fb1e8122ad15752e4e8b6d34e91a4d25bf4f6de9c2f45ee9fb7d9c296ead"} Mar 18 14:08:06 crc kubenswrapper[4756]: I0318 14:08:06.915571 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:08:06 crc kubenswrapper[4756]: I0318 14:08:06.915936 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:08:07 crc kubenswrapper[4756]: I0318 14:08:07.905982 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564048-nw25j" Mar 18 14:08:07 crc kubenswrapper[4756]: I0318 14:08:07.926884 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnlnj\" (UniqueName: \"kubernetes.io/projected/208a151e-414a-426e-9133-9bbcecd3445e-kube-api-access-xnlnj\") pod \"208a151e-414a-426e-9133-9bbcecd3445e\" (UID: \"208a151e-414a-426e-9133-9bbcecd3445e\") " Mar 18 14:08:07 crc kubenswrapper[4756]: I0318 14:08:07.933603 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/208a151e-414a-426e-9133-9bbcecd3445e-kube-api-access-xnlnj" (OuterVolumeSpecName: "kube-api-access-xnlnj") pod "208a151e-414a-426e-9133-9bbcecd3445e" (UID: "208a151e-414a-426e-9133-9bbcecd3445e"). InnerVolumeSpecName "kube-api-access-xnlnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:08:08 crc kubenswrapper[4756]: I0318 14:08:08.028349 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnlnj\" (UniqueName: \"kubernetes.io/projected/208a151e-414a-426e-9133-9bbcecd3445e-kube-api-access-xnlnj\") on node \"crc\" DevicePath \"\"" Mar 18 14:08:08 crc kubenswrapper[4756]: I0318 14:08:08.638145 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564048-nw25j" event={"ID":"208a151e-414a-426e-9133-9bbcecd3445e","Type":"ContainerDied","Data":"f5ae89d974610b91b843defd0090b19286cafe02e83be3841276cbbf10017d66"} Mar 18 14:08:08 crc kubenswrapper[4756]: I0318 14:08:08.638655 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5ae89d974610b91b843defd0090b19286cafe02e83be3841276cbbf10017d66" Mar 18 14:08:08 crc kubenswrapper[4756]: I0318 14:08:08.638459 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564048-nw25j" Mar 18 14:08:08 crc kubenswrapper[4756]: I0318 14:08:08.984737 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564042-6mtlx"] Mar 18 14:08:08 crc kubenswrapper[4756]: I0318 14:08:08.997698 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564042-6mtlx"] Mar 18 14:08:09 crc kubenswrapper[4756]: I0318 14:08:09.324981 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfa461e5-a4e9-4cfa-a279-df6d4a56c973" path="/var/lib/kubelet/pods/bfa461e5-a4e9-4cfa-a279-df6d4a56c973/volumes" Mar 18 14:08:36 crc kubenswrapper[4756]: I0318 14:08:36.915454 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 14:08:36 crc kubenswrapper[4756]: I0318 14:08:36.917309 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:08:36 crc kubenswrapper[4756]: I0318 14:08:36.917425 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:08:36 crc kubenswrapper[4756]: I0318 14:08:36.918331 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7986574aedd1bbfebd839420217d09df3d36a81aa68ea117b5469df20091c844"} pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:08:36 crc kubenswrapper[4756]: I0318 14:08:36.918404 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" containerID="cri-o://7986574aedd1bbfebd839420217d09df3d36a81aa68ea117b5469df20091c844" gracePeriod=600 Mar 18 14:08:37 crc kubenswrapper[4756]: I0318 14:08:37.834047 4756 generic.go:334] "Generic (PLEG): container finished" podID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerID="7986574aedd1bbfebd839420217d09df3d36a81aa68ea117b5469df20091c844" exitCode=0 Mar 18 14:08:37 crc kubenswrapper[4756]: I0318 14:08:37.834094 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" 
event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerDied","Data":"7986574aedd1bbfebd839420217d09df3d36a81aa68ea117b5469df20091c844"} Mar 18 14:08:37 crc kubenswrapper[4756]: I0318 14:08:37.834780 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"5720abaa535d38bd7f462e46e5802b411c62642cbf7053424674cf7b459cf96b"} Mar 18 14:08:37 crc kubenswrapper[4756]: I0318 14:08:37.834818 4756 scope.go:117] "RemoveContainer" containerID="bfc1f7d446b7e4eb090bbf2ff9019b1a637e21a22b8fbc2cec9026448ae87670" Mar 18 14:10:00 crc kubenswrapper[4756]: I0318 14:10:00.145605 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564050-f6gfb"] Mar 18 14:10:00 crc kubenswrapper[4756]: E0318 14:10:00.146472 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="208a151e-414a-426e-9133-9bbcecd3445e" containerName="oc" Mar 18 14:10:00 crc kubenswrapper[4756]: I0318 14:10:00.146490 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="208a151e-414a-426e-9133-9bbcecd3445e" containerName="oc" Mar 18 14:10:00 crc kubenswrapper[4756]: I0318 14:10:00.146653 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="208a151e-414a-426e-9133-9bbcecd3445e" containerName="oc" Mar 18 14:10:00 crc kubenswrapper[4756]: I0318 14:10:00.147209 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564050-f6gfb" Mar 18 14:10:00 crc kubenswrapper[4756]: I0318 14:10:00.150493 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:10:00 crc kubenswrapper[4756]: I0318 14:10:00.150519 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:10:00 crc kubenswrapper[4756]: I0318 14:10:00.151738 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:10:00 crc kubenswrapper[4756]: I0318 14:10:00.157438 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564050-f6gfb"] Mar 18 14:10:00 crc kubenswrapper[4756]: I0318 14:10:00.249353 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44xgj\" (UniqueName: \"kubernetes.io/projected/21801a7f-54fd-4ea9-92d5-be66dce1326d-kube-api-access-44xgj\") pod \"auto-csr-approver-29564050-f6gfb\" (UID: \"21801a7f-54fd-4ea9-92d5-be66dce1326d\") " pod="openshift-infra/auto-csr-approver-29564050-f6gfb" Mar 18 14:10:00 crc kubenswrapper[4756]: I0318 14:10:00.349928 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44xgj\" (UniqueName: \"kubernetes.io/projected/21801a7f-54fd-4ea9-92d5-be66dce1326d-kube-api-access-44xgj\") pod \"auto-csr-approver-29564050-f6gfb\" (UID: \"21801a7f-54fd-4ea9-92d5-be66dce1326d\") " pod="openshift-infra/auto-csr-approver-29564050-f6gfb" Mar 18 14:10:00 crc kubenswrapper[4756]: I0318 14:10:00.378688 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44xgj\" (UniqueName: \"kubernetes.io/projected/21801a7f-54fd-4ea9-92d5-be66dce1326d-kube-api-access-44xgj\") pod \"auto-csr-approver-29564050-f6gfb\" (UID: \"21801a7f-54fd-4ea9-92d5-be66dce1326d\") " 
pod="openshift-infra/auto-csr-approver-29564050-f6gfb" Mar 18 14:10:00 crc kubenswrapper[4756]: I0318 14:10:00.465583 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564050-f6gfb" Mar 18 14:10:00 crc kubenswrapper[4756]: I0318 14:10:00.684410 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564050-f6gfb"] Mar 18 14:10:00 crc kubenswrapper[4756]: W0318 14:10:00.689957 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21801a7f_54fd_4ea9_92d5_be66dce1326d.slice/crio-60026872121a52268b432da31f099a7fc173ddbdc18cf455a51a7a3ed6f99f29 WatchSource:0}: Error finding container 60026872121a52268b432da31f099a7fc173ddbdc18cf455a51a7a3ed6f99f29: Status 404 returned error can't find the container with id 60026872121a52268b432da31f099a7fc173ddbdc18cf455a51a7a3ed6f99f29 Mar 18 14:10:00 crc kubenswrapper[4756]: I0318 14:10:00.692134 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:10:01 crc kubenswrapper[4756]: I0318 14:10:01.380988 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564050-f6gfb" event={"ID":"21801a7f-54fd-4ea9-92d5-be66dce1326d","Type":"ContainerStarted","Data":"60026872121a52268b432da31f099a7fc173ddbdc18cf455a51a7a3ed6f99f29"} Mar 18 14:10:02 crc kubenswrapper[4756]: I0318 14:10:02.388988 4756 generic.go:334] "Generic (PLEG): container finished" podID="21801a7f-54fd-4ea9-92d5-be66dce1326d" containerID="868752be180b3ed97411de02996ed876c4709543caa7e4f3dbeeb6384b1cfd0d" exitCode=0 Mar 18 14:10:02 crc kubenswrapper[4756]: I0318 14:10:02.389073 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564050-f6gfb" 
event={"ID":"21801a7f-54fd-4ea9-92d5-be66dce1326d","Type":"ContainerDied","Data":"868752be180b3ed97411de02996ed876c4709543caa7e4f3dbeeb6384b1cfd0d"} Mar 18 14:10:03 crc kubenswrapper[4756]: I0318 14:10:03.664411 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564050-f6gfb" Mar 18 14:10:03 crc kubenswrapper[4756]: I0318 14:10:03.693490 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44xgj\" (UniqueName: \"kubernetes.io/projected/21801a7f-54fd-4ea9-92d5-be66dce1326d-kube-api-access-44xgj\") pod \"21801a7f-54fd-4ea9-92d5-be66dce1326d\" (UID: \"21801a7f-54fd-4ea9-92d5-be66dce1326d\") " Mar 18 14:10:03 crc kubenswrapper[4756]: I0318 14:10:03.700482 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21801a7f-54fd-4ea9-92d5-be66dce1326d-kube-api-access-44xgj" (OuterVolumeSpecName: "kube-api-access-44xgj") pod "21801a7f-54fd-4ea9-92d5-be66dce1326d" (UID: "21801a7f-54fd-4ea9-92d5-be66dce1326d"). InnerVolumeSpecName "kube-api-access-44xgj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:10:03 crc kubenswrapper[4756]: I0318 14:10:03.794823 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44xgj\" (UniqueName: \"kubernetes.io/projected/21801a7f-54fd-4ea9-92d5-be66dce1326d-kube-api-access-44xgj\") on node \"crc\" DevicePath \"\"" Mar 18 14:10:04 crc kubenswrapper[4756]: I0318 14:10:04.405363 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564050-f6gfb" event={"ID":"21801a7f-54fd-4ea9-92d5-be66dce1326d","Type":"ContainerDied","Data":"60026872121a52268b432da31f099a7fc173ddbdc18cf455a51a7a3ed6f99f29"} Mar 18 14:10:04 crc kubenswrapper[4756]: I0318 14:10:04.405415 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60026872121a52268b432da31f099a7fc173ddbdc18cf455a51a7a3ed6f99f29" Mar 18 14:10:04 crc kubenswrapper[4756]: I0318 14:10:04.405446 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564050-f6gfb" Mar 18 14:10:04 crc kubenswrapper[4756]: I0318 14:10:04.726083 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564044-fs4jl"] Mar 18 14:10:04 crc kubenswrapper[4756]: I0318 14:10:04.729080 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564044-fs4jl"] Mar 18 14:10:05 crc kubenswrapper[4756]: I0318 14:10:05.328538 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f367985-a362-46d8-8dab-205cb7756e9e" path="/var/lib/kubelet/pods/0f367985-a362-46d8-8dab-205cb7756e9e/volumes" Mar 18 14:10:20 crc kubenswrapper[4756]: I0318 14:10:20.106489 4756 scope.go:117] "RemoveContainer" containerID="edc3229d8de13bc3179b6b88bae3146b8c256dadefa49b635551df9aeb16fa95" Mar 18 14:11:06 crc kubenswrapper[4756]: I0318 14:11:06.915804 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:11:06 crc kubenswrapper[4756]: I0318 14:11:06.916425 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:11:20 crc kubenswrapper[4756]: I0318 14:11:20.174504 4756 scope.go:117] "RemoveContainer" containerID="db41a032f116f2ee9e8f001ee85ed6982558149808fd2bac78fad8880a53578e" Mar 18 14:11:36 crc kubenswrapper[4756]: I0318 14:11:36.914835 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:11:36 crc kubenswrapper[4756]: I0318 14:11:36.915465 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:12:00 crc kubenswrapper[4756]: I0318 14:12:00.150328 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564052-m29g7"] Mar 18 14:12:00 crc kubenswrapper[4756]: E0318 14:12:00.151333 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21801a7f-54fd-4ea9-92d5-be66dce1326d" containerName="oc" Mar 18 14:12:00 crc kubenswrapper[4756]: I0318 14:12:00.151376 4756 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="21801a7f-54fd-4ea9-92d5-be66dce1326d" containerName="oc" Mar 18 14:12:00 crc kubenswrapper[4756]: I0318 14:12:00.151623 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="21801a7f-54fd-4ea9-92d5-be66dce1326d" containerName="oc" Mar 18 14:12:00 crc kubenswrapper[4756]: I0318 14:12:00.152230 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564052-m29g7" Mar 18 14:12:00 crc kubenswrapper[4756]: I0318 14:12:00.155094 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:12:00 crc kubenswrapper[4756]: I0318 14:12:00.155371 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:12:00 crc kubenswrapper[4756]: I0318 14:12:00.155707 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:12:00 crc kubenswrapper[4756]: I0318 14:12:00.169436 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564052-m29g7"] Mar 18 14:12:00 crc kubenswrapper[4756]: I0318 14:12:00.257922 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmh7v\" (UniqueName: \"kubernetes.io/projected/f37f5eec-136f-4489-9945-c63457566f85-kube-api-access-dmh7v\") pod \"auto-csr-approver-29564052-m29g7\" (UID: \"f37f5eec-136f-4489-9945-c63457566f85\") " pod="openshift-infra/auto-csr-approver-29564052-m29g7" Mar 18 14:12:00 crc kubenswrapper[4756]: I0318 14:12:00.359784 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmh7v\" (UniqueName: \"kubernetes.io/projected/f37f5eec-136f-4489-9945-c63457566f85-kube-api-access-dmh7v\") pod \"auto-csr-approver-29564052-m29g7\" (UID: \"f37f5eec-136f-4489-9945-c63457566f85\") " 
pod="openshift-infra/auto-csr-approver-29564052-m29g7" Mar 18 14:12:00 crc kubenswrapper[4756]: I0318 14:12:00.386031 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmh7v\" (UniqueName: \"kubernetes.io/projected/f37f5eec-136f-4489-9945-c63457566f85-kube-api-access-dmh7v\") pod \"auto-csr-approver-29564052-m29g7\" (UID: \"f37f5eec-136f-4489-9945-c63457566f85\") " pod="openshift-infra/auto-csr-approver-29564052-m29g7" Mar 18 14:12:00 crc kubenswrapper[4756]: I0318 14:12:00.485466 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564052-m29g7" Mar 18 14:12:00 crc kubenswrapper[4756]: I0318 14:12:00.937998 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564052-m29g7"] Mar 18 14:12:01 crc kubenswrapper[4756]: I0318 14:12:01.163684 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564052-m29g7" event={"ID":"f37f5eec-136f-4489-9945-c63457566f85","Type":"ContainerStarted","Data":"23801b29f1f6a70a7dec2c65d3d2b8994ab0747302481b751808e21659a37e08"} Mar 18 14:12:03 crc kubenswrapper[4756]: I0318 14:12:03.179671 4756 generic.go:334] "Generic (PLEG): container finished" podID="f37f5eec-136f-4489-9945-c63457566f85" containerID="0e9092047ae6949d66cc88c7ad464f2d450e4c2d8c7f250e2110836600e2c6c5" exitCode=0 Mar 18 14:12:03 crc kubenswrapper[4756]: I0318 14:12:03.179829 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564052-m29g7" event={"ID":"f37f5eec-136f-4489-9945-c63457566f85","Type":"ContainerDied","Data":"0e9092047ae6949d66cc88c7ad464f2d450e4c2d8c7f250e2110836600e2c6c5"} Mar 18 14:12:04 crc kubenswrapper[4756]: I0318 14:12:04.399039 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564052-m29g7" Mar 18 14:12:04 crc kubenswrapper[4756]: I0318 14:12:04.513573 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmh7v\" (UniqueName: \"kubernetes.io/projected/f37f5eec-136f-4489-9945-c63457566f85-kube-api-access-dmh7v\") pod \"f37f5eec-136f-4489-9945-c63457566f85\" (UID: \"f37f5eec-136f-4489-9945-c63457566f85\") " Mar 18 14:12:04 crc kubenswrapper[4756]: I0318 14:12:04.521831 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f37f5eec-136f-4489-9945-c63457566f85-kube-api-access-dmh7v" (OuterVolumeSpecName: "kube-api-access-dmh7v") pod "f37f5eec-136f-4489-9945-c63457566f85" (UID: "f37f5eec-136f-4489-9945-c63457566f85"). InnerVolumeSpecName "kube-api-access-dmh7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:12:04 crc kubenswrapper[4756]: I0318 14:12:04.614933 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmh7v\" (UniqueName: \"kubernetes.io/projected/f37f5eec-136f-4489-9945-c63457566f85-kube-api-access-dmh7v\") on node \"crc\" DevicePath \"\"" Mar 18 14:12:05 crc kubenswrapper[4756]: I0318 14:12:05.196817 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564052-m29g7" event={"ID":"f37f5eec-136f-4489-9945-c63457566f85","Type":"ContainerDied","Data":"23801b29f1f6a70a7dec2c65d3d2b8994ab0747302481b751808e21659a37e08"} Mar 18 14:12:05 crc kubenswrapper[4756]: I0318 14:12:05.196888 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23801b29f1f6a70a7dec2c65d3d2b8994ab0747302481b751808e21659a37e08" Mar 18 14:12:05 crc kubenswrapper[4756]: I0318 14:12:05.196916 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564052-m29g7" Mar 18 14:12:05 crc kubenswrapper[4756]: I0318 14:12:05.472479 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564046-6bpkl"] Mar 18 14:12:05 crc kubenswrapper[4756]: I0318 14:12:05.476447 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564046-6bpkl"] Mar 18 14:12:06 crc kubenswrapper[4756]: I0318 14:12:06.915947 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:12:06 crc kubenswrapper[4756]: I0318 14:12:06.916411 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:12:06 crc kubenswrapper[4756]: I0318 14:12:06.916495 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:12:06 crc kubenswrapper[4756]: I0318 14:12:06.917442 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5720abaa535d38bd7f462e46e5802b411c62642cbf7053424674cf7b459cf96b"} pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:12:06 crc kubenswrapper[4756]: I0318 14:12:06.917589 4756 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" containerID="cri-o://5720abaa535d38bd7f462e46e5802b411c62642cbf7053424674cf7b459cf96b" gracePeriod=600 Mar 18 14:12:07 crc kubenswrapper[4756]: I0318 14:12:07.211239 4756 generic.go:334] "Generic (PLEG): container finished" podID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerID="5720abaa535d38bd7f462e46e5802b411c62642cbf7053424674cf7b459cf96b" exitCode=0 Mar 18 14:12:07 crc kubenswrapper[4756]: I0318 14:12:07.211299 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerDied","Data":"5720abaa535d38bd7f462e46e5802b411c62642cbf7053424674cf7b459cf96b"} Mar 18 14:12:07 crc kubenswrapper[4756]: I0318 14:12:07.211341 4756 scope.go:117] "RemoveContainer" containerID="7986574aedd1bbfebd839420217d09df3d36a81aa68ea117b5469df20091c844" Mar 18 14:12:07 crc kubenswrapper[4756]: I0318 14:12:07.325092 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abe405f4-7a33-48cb-be35-37815487343f" path="/var/lib/kubelet/pods/abe405f4-7a33-48cb-be35-37815487343f/volumes" Mar 18 14:12:08 crc kubenswrapper[4756]: I0318 14:12:08.220638 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"617eebb4a8c3d04af231bb44e996daa1896f056ada27eee9b25a69c05455bb74"} Mar 18 14:12:20 crc kubenswrapper[4756]: I0318 14:12:20.252466 4756 scope.go:117] "RemoveContainer" containerID="786b7accb599d44b74db20cc44d7928d481a7f1829c31af618ebf0b67f372ddd" Mar 18 14:12:46 crc kubenswrapper[4756]: I0318 14:12:46.788218 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv"] 
Mar 18 14:12:46 crc kubenswrapper[4756]: E0318 14:12:46.788950 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37f5eec-136f-4489-9945-c63457566f85" containerName="oc" Mar 18 14:12:46 crc kubenswrapper[4756]: I0318 14:12:46.788963 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37f5eec-136f-4489-9945-c63457566f85" containerName="oc" Mar 18 14:12:46 crc kubenswrapper[4756]: I0318 14:12:46.789056 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f37f5eec-136f-4489-9945-c63457566f85" containerName="oc" Mar 18 14:12:46 crc kubenswrapper[4756]: I0318 14:12:46.789922 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" Mar 18 14:12:46 crc kubenswrapper[4756]: I0318 14:12:46.791462 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 14:12:46 crc kubenswrapper[4756]: I0318 14:12:46.798161 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv"] Mar 18 14:12:46 crc kubenswrapper[4756]: I0318 14:12:46.891822 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da51cda2-ab80-4d1a-b074-534c572d5803-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv\" (UID: \"da51cda2-ab80-4d1a-b074-534c572d5803\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" Mar 18 14:12:46 crc kubenswrapper[4756]: I0318 14:12:46.891880 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da51cda2-ab80-4d1a-b074-534c572d5803-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv\" (UID: 
\"da51cda2-ab80-4d1a-b074-534c572d5803\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" Mar 18 14:12:46 crc kubenswrapper[4756]: I0318 14:12:46.891998 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9bck\" (UniqueName: \"kubernetes.io/projected/da51cda2-ab80-4d1a-b074-534c572d5803-kube-api-access-p9bck\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv\" (UID: \"da51cda2-ab80-4d1a-b074-534c572d5803\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" Mar 18 14:12:46 crc kubenswrapper[4756]: I0318 14:12:46.993866 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9bck\" (UniqueName: \"kubernetes.io/projected/da51cda2-ab80-4d1a-b074-534c572d5803-kube-api-access-p9bck\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv\" (UID: \"da51cda2-ab80-4d1a-b074-534c572d5803\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" Mar 18 14:12:46 crc kubenswrapper[4756]: I0318 14:12:46.993942 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da51cda2-ab80-4d1a-b074-534c572d5803-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv\" (UID: \"da51cda2-ab80-4d1a-b074-534c572d5803\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" Mar 18 14:12:46 crc kubenswrapper[4756]: I0318 14:12:46.993987 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da51cda2-ab80-4d1a-b074-534c572d5803-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv\" (UID: \"da51cda2-ab80-4d1a-b074-534c572d5803\") " 
pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" Mar 18 14:12:46 crc kubenswrapper[4756]: I0318 14:12:46.994475 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da51cda2-ab80-4d1a-b074-534c572d5803-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv\" (UID: \"da51cda2-ab80-4d1a-b074-534c572d5803\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" Mar 18 14:12:46 crc kubenswrapper[4756]: I0318 14:12:46.994484 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da51cda2-ab80-4d1a-b074-534c572d5803-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv\" (UID: \"da51cda2-ab80-4d1a-b074-534c572d5803\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" Mar 18 14:12:47 crc kubenswrapper[4756]: I0318 14:12:47.014769 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9bck\" (UniqueName: \"kubernetes.io/projected/da51cda2-ab80-4d1a-b074-534c572d5803-kube-api-access-p9bck\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv\" (UID: \"da51cda2-ab80-4d1a-b074-534c572d5803\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" Mar 18 14:12:47 crc kubenswrapper[4756]: I0318 14:12:47.107642 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" Mar 18 14:12:47 crc kubenswrapper[4756]: I0318 14:12:47.267538 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv"] Mar 18 14:12:47 crc kubenswrapper[4756]: I0318 14:12:47.476369 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" event={"ID":"da51cda2-ab80-4d1a-b074-534c572d5803","Type":"ContainerStarted","Data":"edd9170af9b68e44d73fd645668c93c1a5b0607dbc65cfc4c461159bbf8493da"} Mar 18 14:12:47 crc kubenswrapper[4756]: I0318 14:12:47.476437 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" event={"ID":"da51cda2-ab80-4d1a-b074-534c572d5803","Type":"ContainerStarted","Data":"89b670f09da5f296a05ae84f1beef16b171429abf40008ed02117d076a711761"} Mar 18 14:12:48 crc kubenswrapper[4756]: I0318 14:12:48.485177 4756 generic.go:334] "Generic (PLEG): container finished" podID="da51cda2-ab80-4d1a-b074-534c572d5803" containerID="edd9170af9b68e44d73fd645668c93c1a5b0607dbc65cfc4c461159bbf8493da" exitCode=0 Mar 18 14:12:48 crc kubenswrapper[4756]: I0318 14:12:48.485243 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" event={"ID":"da51cda2-ab80-4d1a-b074-534c572d5803","Type":"ContainerDied","Data":"edd9170af9b68e44d73fd645668c93c1a5b0607dbc65cfc4c461159bbf8493da"} Mar 18 14:12:52 crc kubenswrapper[4756]: I0318 14:12:52.527342 4756 generic.go:334] "Generic (PLEG): container finished" podID="da51cda2-ab80-4d1a-b074-534c572d5803" containerID="13dc48b548a4009b7200020193e843370b216ad8ec58a4648fc4c2c21374c002" exitCode=0 Mar 18 14:12:52 crc kubenswrapper[4756]: I0318 14:12:52.527395 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" event={"ID":"da51cda2-ab80-4d1a-b074-534c572d5803","Type":"ContainerDied","Data":"13dc48b548a4009b7200020193e843370b216ad8ec58a4648fc4c2c21374c002"} Mar 18 14:12:53 crc kubenswrapper[4756]: I0318 14:12:53.539042 4756 generic.go:334] "Generic (PLEG): container finished" podID="da51cda2-ab80-4d1a-b074-534c572d5803" containerID="eb93ed9577572810ed35e3e9f7e2a1b9c75f1f5c10f29f9031c2a8f8e059c8ae" exitCode=0 Mar 18 14:12:53 crc kubenswrapper[4756]: I0318 14:12:53.539420 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" event={"ID":"da51cda2-ab80-4d1a-b074-534c572d5803","Type":"ContainerDied","Data":"eb93ed9577572810ed35e3e9f7e2a1b9c75f1f5c10f29f9031c2a8f8e059c8ae"} Mar 18 14:12:54 crc kubenswrapper[4756]: I0318 14:12:54.844552 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" Mar 18 14:12:54 crc kubenswrapper[4756]: I0318 14:12:54.996063 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da51cda2-ab80-4d1a-b074-534c572d5803-util\") pod \"da51cda2-ab80-4d1a-b074-534c572d5803\" (UID: \"da51cda2-ab80-4d1a-b074-534c572d5803\") " Mar 18 14:12:54 crc kubenswrapper[4756]: I0318 14:12:54.996236 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da51cda2-ab80-4d1a-b074-534c572d5803-bundle\") pod \"da51cda2-ab80-4d1a-b074-534c572d5803\" (UID: \"da51cda2-ab80-4d1a-b074-534c572d5803\") " Mar 18 14:12:54 crc kubenswrapper[4756]: I0318 14:12:54.996434 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9bck\" (UniqueName: \"kubernetes.io/projected/da51cda2-ab80-4d1a-b074-534c572d5803-kube-api-access-p9bck\") pod \"da51cda2-ab80-4d1a-b074-534c572d5803\" (UID: \"da51cda2-ab80-4d1a-b074-534c572d5803\") " Mar 18 14:12:55 crc kubenswrapper[4756]: I0318 14:12:55.001642 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da51cda2-ab80-4d1a-b074-534c572d5803-bundle" (OuterVolumeSpecName: "bundle") pod "da51cda2-ab80-4d1a-b074-534c572d5803" (UID: "da51cda2-ab80-4d1a-b074-534c572d5803"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:12:55 crc kubenswrapper[4756]: I0318 14:12:55.002585 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da51cda2-ab80-4d1a-b074-534c572d5803-kube-api-access-p9bck" (OuterVolumeSpecName: "kube-api-access-p9bck") pod "da51cda2-ab80-4d1a-b074-534c572d5803" (UID: "da51cda2-ab80-4d1a-b074-534c572d5803"). InnerVolumeSpecName "kube-api-access-p9bck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:12:55 crc kubenswrapper[4756]: I0318 14:12:55.019101 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da51cda2-ab80-4d1a-b074-534c572d5803-util" (OuterVolumeSpecName: "util") pod "da51cda2-ab80-4d1a-b074-534c572d5803" (UID: "da51cda2-ab80-4d1a-b074-534c572d5803"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:12:55 crc kubenswrapper[4756]: I0318 14:12:55.098578 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9bck\" (UniqueName: \"kubernetes.io/projected/da51cda2-ab80-4d1a-b074-534c572d5803-kube-api-access-p9bck\") on node \"crc\" DevicePath \"\"" Mar 18 14:12:55 crc kubenswrapper[4756]: I0318 14:12:55.098685 4756 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da51cda2-ab80-4d1a-b074-534c572d5803-util\") on node \"crc\" DevicePath \"\"" Mar 18 14:12:55 crc kubenswrapper[4756]: I0318 14:12:55.098708 4756 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da51cda2-ab80-4d1a-b074-534c572d5803-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:12:55 crc kubenswrapper[4756]: I0318 14:12:55.557673 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" event={"ID":"da51cda2-ab80-4d1a-b074-534c572d5803","Type":"ContainerDied","Data":"89b670f09da5f296a05ae84f1beef16b171429abf40008ed02117d076a711761"} Mar 18 14:12:55 crc kubenswrapper[4756]: I0318 14:12:55.558038 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89b670f09da5f296a05ae84f1beef16b171429abf40008ed02117d076a711761" Mar 18 14:12:55 crc kubenswrapper[4756]: I0318 14:12:55.557836 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv" Mar 18 14:12:59 crc kubenswrapper[4756]: I0318 14:12:59.921965 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hgh2m"] Mar 18 14:12:59 crc kubenswrapper[4756]: I0318 14:12:59.922579 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovn-controller" containerID="cri-o://89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073" gracePeriod=30 Mar 18 14:12:59 crc kubenswrapper[4756]: I0318 14:12:59.922594 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="northd" containerID="cri-o://fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b" gracePeriod=30 Mar 18 14:12:59 crc kubenswrapper[4756]: I0318 14:12:59.922625 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="kube-rbac-proxy-node" containerID="cri-o://a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d" gracePeriod=30 Mar 18 14:12:59 crc kubenswrapper[4756]: I0318 14:12:59.922689 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="nbdb" containerID="cri-o://1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad" gracePeriod=30 Mar 18 14:12:59 crc kubenswrapper[4756]: I0318 14:12:59.922660 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" 
containerName="ovn-acl-logging" containerID="cri-o://b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac" gracePeriod=30 Mar 18 14:12:59 crc kubenswrapper[4756]: I0318 14:12:59.922714 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="sbdb" containerID="cri-o://d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd" gracePeriod=30 Mar 18 14:12:59 crc kubenswrapper[4756]: I0318 14:12:59.922659 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8" gracePeriod=30 Mar 18 14:12:59 crc kubenswrapper[4756]: I0318 14:12:59.969842 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" containerID="cri-o://b382ec1fcb9424f71540b6ace5af00998ac0b7cf5b869c2c61709c03b6fa1c94" gracePeriod=30 Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.585361 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wz5hm_13703604-4b4e-4eb2-b311-88457b667918/kube-multus/2.log" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.585746 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wz5hm_13703604-4b4e-4eb2-b311-88457b667918/kube-multus/1.log" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.585784 4756 generic.go:334] "Generic (PLEG): container finished" podID="13703604-4b4e-4eb2-b311-88457b667918" containerID="6d271e322ff997b6b5d2c9dcc6a298d8e41b723e1cb2c048962de813499e1b54" exitCode=2 Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.585832 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wz5hm" event={"ID":"13703604-4b4e-4eb2-b311-88457b667918","Type":"ContainerDied","Data":"6d271e322ff997b6b5d2c9dcc6a298d8e41b723e1cb2c048962de813499e1b54"} Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.585868 4756 scope.go:117] "RemoveContainer" containerID="a126ca0e73bc313c3b86c281a1627b238bb55f129d6ef0226b530e34d2b9d629" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.586323 4756 scope.go:117] "RemoveContainer" containerID="6d271e322ff997b6b5d2c9dcc6a298d8e41b723e1cb2c048962de813499e1b54" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.586524 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wz5hm_openshift-multus(13703604-4b4e-4eb2-b311-88457b667918)\"" pod="openshift-multus/multus-wz5hm" podUID="13703604-4b4e-4eb2-b311-88457b667918" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.589147 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovnkube-controller/3.log" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.590868 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovn-acl-logging/0.log" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.591375 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovn-controller/0.log" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.592062 4756 generic.go:334] "Generic (PLEG): container finished" podID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerID="b382ec1fcb9424f71540b6ace5af00998ac0b7cf5b869c2c61709c03b6fa1c94" exitCode=0 Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 
14:13:00.592095 4756 generic.go:334] "Generic (PLEG): container finished" podID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerID="d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd" exitCode=0 Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.592103 4756 generic.go:334] "Generic (PLEG): container finished" podID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerID="1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad" exitCode=0 Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.592112 4756 generic.go:334] "Generic (PLEG): container finished" podID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerID="fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b" exitCode=0 Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.592142 4756 generic.go:334] "Generic (PLEG): container finished" podID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerID="19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8" exitCode=0 Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.592150 4756 generic.go:334] "Generic (PLEG): container finished" podID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerID="a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d" exitCode=0 Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.592161 4756 generic.go:334] "Generic (PLEG): container finished" podID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerID="b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac" exitCode=143 Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.592170 4756 generic.go:334] "Generic (PLEG): container finished" podID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerID="89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073" exitCode=143 Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.592185 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" 
event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerDied","Data":"b382ec1fcb9424f71540b6ace5af00998ac0b7cf5b869c2c61709c03b6fa1c94"} Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.592267 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerDied","Data":"d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd"} Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.592283 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerDied","Data":"1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad"} Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.592296 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerDied","Data":"fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b"} Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.592308 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerDied","Data":"19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8"} Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.592319 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerDied","Data":"a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d"} Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.592331 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" 
event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerDied","Data":"b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac"} Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.592343 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerDied","Data":"89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073"} Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.603243 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovnkube-controller/3.log" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.605049 4756 scope.go:117] "RemoveContainer" containerID="3129e85bfcbe3f7709080c1b1f2524e015819df06e9c57dcd03a719d44ab331d" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.607531 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovn-acl-logging/0.log" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.607961 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovn-controller/0.log" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.608536 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.670922 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mn5sl"] Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671162 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671184 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671195 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671203 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671215 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671224 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671235 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da51cda2-ab80-4d1a-b074-534c572d5803" containerName="extract" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671242 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="da51cda2-ab80-4d1a-b074-534c572d5803" containerName="extract" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671255 4756 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671264 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671277 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="kubecfg-setup" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671284 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="kubecfg-setup" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671293 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671300 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671309 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da51cda2-ab80-4d1a-b074-534c572d5803" containerName="util" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671316 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="da51cda2-ab80-4d1a-b074-534c572d5803" containerName="util" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671326 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovn-acl-logging" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671335 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovn-acl-logging" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671345 4756 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="kube-rbac-proxy-node" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671351 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="kube-rbac-proxy-node" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671362 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="sbdb" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671368 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="sbdb" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671375 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovn-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671381 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovn-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671390 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="northd" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671396 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="northd" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671402 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da51cda2-ab80-4d1a-b074-534c572d5803" containerName="pull" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671408 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="da51cda2-ab80-4d1a-b074-534c572d5803" containerName="pull" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671419 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="nbdb" 
Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671425 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="nbdb" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671513 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671524 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovn-acl-logging" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671533 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671540 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671547 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671554 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671561 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="kube-rbac-proxy-node" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671568 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="northd" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671575 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="da51cda2-ab80-4d1a-b074-534c572d5803" 
containerName="extract" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671584 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="nbdb" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671593 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovn-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671615 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="sbdb" Mar 18 14:13:00 crc kubenswrapper[4756]: E0318 14:13:00.671700 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671707 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.671791 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" containerName="ovnkube-controller" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.673569 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769405 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-etc-openvswitch\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769442 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-cni-bin\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769469 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-cni-netd\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769484 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-slash\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769524 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-openvswitch\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769538 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-systemd\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769523 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769558 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-ovnkube-config\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769591 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-log-socket\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769589 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769620 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769632 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q6cv\" (UniqueName: \"kubernetes.io/projected/c7cf6c03-98fc-4724-acde-a38f32f87496-kube-api-access-9q6cv\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769636 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-slash" (OuterVolumeSpecName: "host-slash") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769658 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-log-socket" (OuterVolumeSpecName: "log-socket") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769652 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-env-overrides\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769725 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-systemd-units\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769760 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c7cf6c03-98fc-4724-acde-a38f32f87496-ovn-node-metrics-cert\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769783 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-run-netns\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769812 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-var-lib-cni-networks-ovn-kubernetes\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769852 4756 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-ovnkube-script-lib\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769875 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-ovn\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769876 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769901 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769924 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-node-log" (OuterVolumeSpecName: "node-log") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769898 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-node-log\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769968 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-var-lib-openvswitch\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769876 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769947 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.769998 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-kubelet\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770024 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-run-ovn-kubernetes\") pod \"c7cf6c03-98fc-4724-acde-a38f32f87496\" (UID: \"c7cf6c03-98fc-4724-acde-a38f32f87496\") " Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770034 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770071 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770078 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770098 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770153 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770202 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770229 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-slash\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770260 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/061feedd-a872-42ca-8f94-4a73aec049e6-ovnkube-config\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770275 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/061feedd-a872-42ca-8f94-4a73aec049e6-ovnkube-script-lib\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770283 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770298 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-cni-netd\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770316 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/061feedd-a872-42ca-8f94-4a73aec049e6-ovn-node-metrics-cert\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770368 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-systemd-units\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770402 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-node-log\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770490 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-mn5sl\" (UID: 
\"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770541 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-etc-openvswitch\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770563 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-run-netns\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770646 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-run-ovn\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770697 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/061feedd-a872-42ca-8f94-4a73aec049e6-env-overrides\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770716 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m24s9\" (UniqueName: \"kubernetes.io/projected/061feedd-a872-42ca-8f94-4a73aec049e6-kube-api-access-m24s9\") pod 
\"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770771 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-run-systemd\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770826 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-cni-bin\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770853 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-run-openvswitch\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770877 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-log-socket\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770905 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-var-lib-openvswitch\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770926 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-kubelet\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.770967 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771076 4756 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771087 4756 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771095 4756 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771104 4756 reconciler_common.go:293] "Volume detached for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-slash\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771128 4756 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771136 4756 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771145 4756 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-log-socket\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771154 4756 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771162 4756 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771172 4756 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771181 4756 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771190 4756 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c7cf6c03-98fc-4724-acde-a38f32f87496-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771198 4756 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771207 4756 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-node-log\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771215 4756 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771224 4756 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.771232 4756 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.774314 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7cf6c03-98fc-4724-acde-a38f32f87496-kube-api-access-9q6cv" (OuterVolumeSpecName: "kube-api-access-9q6cv") 
pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "kube-api-access-9q6cv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.774439 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7cf6c03-98fc-4724-acde-a38f32f87496-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.781054 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "c7cf6c03-98fc-4724-acde-a38f32f87496" (UID: "c7cf6c03-98fc-4724-acde-a38f32f87496"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872517 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-cni-bin\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872580 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-run-openvswitch\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872601 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-log-socket\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872619 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-var-lib-openvswitch\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872634 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-kubelet\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc 
kubenswrapper[4756]: I0318 14:13:00.872655 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872684 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-slash\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872691 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-cni-bin\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872711 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-run-openvswitch\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872698 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/061feedd-a872-42ca-8f94-4a73aec049e6-ovnkube-config\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872786 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/061feedd-a872-42ca-8f94-4a73aec049e6-ovnkube-script-lib\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872791 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-kubelet\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872827 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-log-socket\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872835 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-cni-netd\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872854 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-var-lib-openvswitch\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872870 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/061feedd-a872-42ca-8f94-4a73aec049e6-ovn-node-metrics-cert\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872880 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872924 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-systemd-units\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872952 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-node-log\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.872982 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873029 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-etc-openvswitch\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873053 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-run-netns\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873149 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-systemd-units\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873149 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-node-log\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873175 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-etc-openvswitch\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873188 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-cni-netd\") pod \"ovnkube-node-mn5sl\" 
(UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873216 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873219 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-run-netns\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873234 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-run-ovn\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873206 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-run-ovn\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873309 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/061feedd-a872-42ca-8f94-4a73aec049e6-env-overrides\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 
crc kubenswrapper[4756]: I0318 14:13:00.873340 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m24s9\" (UniqueName: \"kubernetes.io/projected/061feedd-a872-42ca-8f94-4a73aec049e6-kube-api-access-m24s9\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873378 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-run-systemd\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873342 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/061feedd-a872-42ca-8f94-4a73aec049e6-ovnkube-config\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873442 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-run-systemd\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873499 4756 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c7cf6c03-98fc-4724-acde-a38f32f87496-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873518 4756 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/c7cf6c03-98fc-4724-acde-a38f32f87496-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873531 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q6cv\" (UniqueName: \"kubernetes.io/projected/c7cf6c03-98fc-4724-acde-a38f32f87496-kube-api-access-9q6cv\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873556 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/061feedd-a872-42ca-8f94-4a73aec049e6-ovnkube-script-lib\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873612 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/061feedd-a872-42ca-8f94-4a73aec049e6-host-slash\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.873664 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/061feedd-a872-42ca-8f94-4a73aec049e6-env-overrides\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.876591 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/061feedd-a872-42ca-8f94-4a73aec049e6-ovn-node-metrics-cert\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.889077 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m24s9\" (UniqueName: \"kubernetes.io/projected/061feedd-a872-42ca-8f94-4a73aec049e6-kube-api-access-m24s9\") pod \"ovnkube-node-mn5sl\" (UID: \"061feedd-a872-42ca-8f94-4a73aec049e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:00 crc kubenswrapper[4756]: I0318 14:13:00.985812 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.608769 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wz5hm_13703604-4b4e-4eb2-b311-88457b667918/kube-multus/2.log" Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.611603 4756 generic.go:334] "Generic (PLEG): container finished" podID="061feedd-a872-42ca-8f94-4a73aec049e6" containerID="99bfaa6e3fbd9cce56078d27f4c2e39d54949fd1c0a0ae1bfd20dde04d99f6db" exitCode=0 Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.611666 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" event={"ID":"061feedd-a872-42ca-8f94-4a73aec049e6","Type":"ContainerDied","Data":"99bfaa6e3fbd9cce56078d27f4c2e39d54949fd1c0a0ae1bfd20dde04d99f6db"} Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.611693 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" event={"ID":"061feedd-a872-42ca-8f94-4a73aec049e6","Type":"ContainerStarted","Data":"76f7d3642fb4cc38b0524612c81fd90d85f345fc0ffcd046ac51c79759139973"} Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.632187 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovn-acl-logging/0.log" Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.636625 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hgh2m_c7cf6c03-98fc-4724-acde-a38f32f87496/ovn-controller/0.log" Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.638305 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" event={"ID":"c7cf6c03-98fc-4724-acde-a38f32f87496","Type":"ContainerDied","Data":"c720882114d76f7e173d8ca834be30e689bace91f86da99fabcbad99c5d5c213"} Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.638344 4756 scope.go:117] "RemoveContainer" containerID="b382ec1fcb9424f71540b6ace5af00998ac0b7cf5b869c2c61709c03b6fa1c94" Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.638491 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hgh2m" Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.680839 4756 scope.go:117] "RemoveContainer" containerID="d5066ba483e04f698bfdaf05189c0cfa8c3ce051923996363cb1a9f34a7fa5cd" Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.708381 4756 scope.go:117] "RemoveContainer" containerID="1bb6f70dd1c487eb19fd710cebeacfd65b3209beec380ccdc8e4fb551feb84ad" Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.725182 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hgh2m"] Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.738811 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hgh2m"] Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.765290 4756 scope.go:117] "RemoveContainer" containerID="fa5c9dcc43fb2bbfbbbd6555f56b8a7ac9141f7b669b87980928c4f91bacea2b" Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.797477 4756 scope.go:117] "RemoveContainer" containerID="19a37a9fa2d19a8c1f3e47823756c1e734bbd70977320081eef90842d76f9be8" Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.809086 4756 scope.go:117] "RemoveContainer" 
containerID="a7c4a050018a9042b60beb7ae85a2e60e9a8c3ccc17574273209a6670d4a497d" Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.826486 4756 scope.go:117] "RemoveContainer" containerID="b14f55438b858ea74ea30362489039ef48fc3a9e266f16d7cd2c6f26d3ac2aac" Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.852249 4756 scope.go:117] "RemoveContainer" containerID="89e9e3b262f20573d98905b008d3957a3b6023f77e69b76d4125619010cb1073" Mar 18 14:13:01 crc kubenswrapper[4756]: I0318 14:13:01.868353 4756 scope.go:117] "RemoveContainer" containerID="78403c5f9d14742ff46af5df9d557ea1f1345343ee527a436124503a86ad376f" Mar 18 14:13:02 crc kubenswrapper[4756]: I0318 14:13:02.666334 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" event={"ID":"061feedd-a872-42ca-8f94-4a73aec049e6","Type":"ContainerStarted","Data":"032c15d6cf0fb886ab856a4bea4157b3fd8c99ee74524efe787a53a511d8ce94"} Mar 18 14:13:02 crc kubenswrapper[4756]: I0318 14:13:02.666630 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" event={"ID":"061feedd-a872-42ca-8f94-4a73aec049e6","Type":"ContainerStarted","Data":"f4bf5c2769d29511d836996c55f3c710c3525e52119f60c5a22c0f6c1074ba5d"} Mar 18 14:13:02 crc kubenswrapper[4756]: I0318 14:13:02.666643 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" event={"ID":"061feedd-a872-42ca-8f94-4a73aec049e6","Type":"ContainerStarted","Data":"998e4b8d3282af23681757cc05aeef89b5922bc5b79d013800292977ab9de911"} Mar 18 14:13:02 crc kubenswrapper[4756]: I0318 14:13:02.666652 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" event={"ID":"061feedd-a872-42ca-8f94-4a73aec049e6","Type":"ContainerStarted","Data":"4d94501812d69583ef89274061e63a5e9d93dfcc890f3b77bb2f1b9ee2ef96ea"} Mar 18 14:13:02 crc kubenswrapper[4756]: I0318 14:13:02.666661 4756 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" event={"ID":"061feedd-a872-42ca-8f94-4a73aec049e6","Type":"ContainerStarted","Data":"0b96b106fe980e994da8168ecec58a95e980b7d5c478190e18191590898665cf"} Mar 18 14:13:02 crc kubenswrapper[4756]: I0318 14:13:02.666669 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" event={"ID":"061feedd-a872-42ca-8f94-4a73aec049e6","Type":"ContainerStarted","Data":"e6f9f330c88258f0385ead81b2b29adccb76d5772606d896d290b30920912008"} Mar 18 14:13:03 crc kubenswrapper[4756]: I0318 14:13:03.321602 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7cf6c03-98fc-4724-acde-a38f32f87496" path="/var/lib/kubelet/pods/c7cf6c03-98fc-4724-acde-a38f32f87496/volumes" Mar 18 14:13:04 crc kubenswrapper[4756]: I0318 14:13:04.680397 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" event={"ID":"061feedd-a872-42ca-8f94-4a73aec049e6","Type":"ContainerStarted","Data":"2bf8f6ad2dec1bfec49cd3132247408581f1d211914321779f06822103d7c6f5"} Mar 18 14:13:05 crc kubenswrapper[4756]: I0318 14:13:05.598703 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd"] Mar 18 14:13:05 crc kubenswrapper[4756]: I0318 14:13:05.599346 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:05 crc kubenswrapper[4756]: I0318 14:13:05.601243 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-g4ft9" Mar 18 14:13:05 crc kubenswrapper[4756]: I0318 14:13:05.601930 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 18 14:13:05 crc kubenswrapper[4756]: I0318 14:13:05.602175 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 18 14:13:05 crc kubenswrapper[4756]: I0318 14:13:05.732033 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-247sn\" (UniqueName: \"kubernetes.io/projected/df2f3290-a194-4fa7-9c5c-533c329bc34b-kube-api-access-247sn\") pod \"obo-prometheus-operator-8ff7d675-6kvvd\" (UID: \"df2f3290-a194-4fa7-9c5c-533c329bc34b\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:05 crc kubenswrapper[4756]: I0318 14:13:05.833580 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-247sn\" (UniqueName: \"kubernetes.io/projected/df2f3290-a194-4fa7-9c5c-533c329bc34b-kube-api-access-247sn\") pod \"obo-prometheus-operator-8ff7d675-6kvvd\" (UID: \"df2f3290-a194-4fa7-9c5c-533c329bc34b\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:05 crc kubenswrapper[4756]: I0318 14:13:05.857152 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-247sn\" (UniqueName: \"kubernetes.io/projected/df2f3290-a194-4fa7-9c5c-533c329bc34b-kube-api-access-247sn\") pod \"obo-prometheus-operator-8ff7d675-6kvvd\" (UID: \"df2f3290-a194-4fa7-9c5c-533c329bc34b\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:05 crc kubenswrapper[4756]: I0318 
14:13:05.933026 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:05 crc kubenswrapper[4756]: E0318 14:13:05.955166 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators_df2f3290-a194-4fa7-9c5c-533c329bc34b_0(927a318a24f007d9111d4c4ba1afdc7f2ffcab9d281fcf6204ae3e75601a6f6b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:13:05 crc kubenswrapper[4756]: E0318 14:13:05.955516 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators_df2f3290-a194-4fa7-9c5c-533c329bc34b_0(927a318a24f007d9111d4c4ba1afdc7f2ffcab9d281fcf6204ae3e75601a6f6b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:05 crc kubenswrapper[4756]: E0318 14:13:05.955536 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators_df2f3290-a194-4fa7-9c5c-533c329bc34b_0(927a318a24f007d9111d4c4ba1afdc7f2ffcab9d281fcf6204ae3e75601a6f6b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:05 crc kubenswrapper[4756]: E0318 14:13:05.955580 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators(df2f3290-a194-4fa7-9c5c-533c329bc34b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators(df2f3290-a194-4fa7-9c5c-533c329bc34b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators_df2f3290-a194-4fa7-9c5c-533c329bc34b_0(927a318a24f007d9111d4c4ba1afdc7f2ffcab9d281fcf6204ae3e75601a6f6b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" podUID="df2f3290-a194-4fa7-9c5c-533c329bc34b" Mar 18 14:13:05 crc kubenswrapper[4756]: I0318 14:13:05.976446 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp"] Mar 18 14:13:05 crc kubenswrapper[4756]: I0318 14:13:05.977088 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:05 crc kubenswrapper[4756]: I0318 14:13:05.979646 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-rvdt9" Mar 18 14:13:05 crc kubenswrapper[4756]: I0318 14:13:05.979646 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 18 14:13:05 crc kubenswrapper[4756]: I0318 14:13:05.988805 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw"] Mar 18 14:13:05 crc kubenswrapper[4756]: I0318 14:13:05.989577 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.137916 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1ddfb6b-2173-4ddc-84aa-437858c62a2a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw\" (UID: \"b1ddfb6b-2173-4ddc-84aa-437858c62a2a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.137976 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1ddfb6b-2173-4ddc-84aa-437858c62a2a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw\" (UID: \"b1ddfb6b-2173-4ddc-84aa-437858c62a2a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.138012 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7cef672e-9e83-4a19-90e0-8d078a871e02-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp\" (UID: \"7cef672e-9e83-4a19-90e0-8d078a871e02\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.138072 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7cef672e-9e83-4a19-90e0-8d078a871e02-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp\" (UID: \"7cef672e-9e83-4a19-90e0-8d078a871e02\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.239072 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7cef672e-9e83-4a19-90e0-8d078a871e02-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp\" (UID: \"7cef672e-9e83-4a19-90e0-8d078a871e02\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.239205 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1ddfb6b-2173-4ddc-84aa-437858c62a2a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw\" (UID: \"b1ddfb6b-2173-4ddc-84aa-437858c62a2a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.239242 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/b1ddfb6b-2173-4ddc-84aa-437858c62a2a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw\" (UID: \"b1ddfb6b-2173-4ddc-84aa-437858c62a2a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.239268 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7cef672e-9e83-4a19-90e0-8d078a871e02-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp\" (UID: \"7cef672e-9e83-4a19-90e0-8d078a871e02\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.243349 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1ddfb6b-2173-4ddc-84aa-437858c62a2a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw\" (UID: \"b1ddfb6b-2173-4ddc-84aa-437858c62a2a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.243350 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7cef672e-9e83-4a19-90e0-8d078a871e02-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp\" (UID: \"7cef672e-9e83-4a19-90e0-8d078a871e02\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.245916 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1ddfb6b-2173-4ddc-84aa-437858c62a2a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw\" (UID: \"b1ddfb6b-2173-4ddc-84aa-437858c62a2a\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.246702 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7cef672e-9e83-4a19-90e0-8d078a871e02-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp\" (UID: \"7cef672e-9e83-4a19-90e0-8d078a871e02\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.279883 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-7rvzd"] Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.280532 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.281965 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-xfnpf" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.282626 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.293046 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.304624 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:06 crc kubenswrapper[4756]: E0318 14:13:06.322253 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators_7cef672e-9e83-4a19-90e0-8d078a871e02_0(6728ddb18a387822c616e5b0e4d28621d7c6ab5129845c2fd60d7499f9d7ac9e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:13:06 crc kubenswrapper[4756]: E0318 14:13:06.322305 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators_7cef672e-9e83-4a19-90e0-8d078a871e02_0(6728ddb18a387822c616e5b0e4d28621d7c6ab5129845c2fd60d7499f9d7ac9e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:06 crc kubenswrapper[4756]: E0318 14:13:06.322324 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators_7cef672e-9e83-4a19-90e0-8d078a871e02_0(6728ddb18a387822c616e5b0e4d28621d7c6ab5129845c2fd60d7499f9d7ac9e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:06 crc kubenswrapper[4756]: E0318 14:13:06.322368 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators(7cef672e-9e83-4a19-90e0-8d078a871e02)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators(7cef672e-9e83-4a19-90e0-8d078a871e02)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators_7cef672e-9e83-4a19-90e0-8d078a871e02_0(6728ddb18a387822c616e5b0e4d28621d7c6ab5129845c2fd60d7499f9d7ac9e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" podUID="7cef672e-9e83-4a19-90e0-8d078a871e02" Mar 18 14:13:06 crc kubenswrapper[4756]: E0318 14:13:06.333481 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators_b1ddfb6b-2173-4ddc-84aa-437858c62a2a_0(9f3f9bf4c634b418615c864f3df677912c480a83bc20dd69c461f3a4c526c1d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:13:06 crc kubenswrapper[4756]: E0318 14:13:06.333553 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators_b1ddfb6b-2173-4ddc-84aa-437858c62a2a_0(9f3f9bf4c634b418615c864f3df677912c480a83bc20dd69c461f3a4c526c1d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:06 crc kubenswrapper[4756]: E0318 14:13:06.333580 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators_b1ddfb6b-2173-4ddc-84aa-437858c62a2a_0(9f3f9bf4c634b418615c864f3df677912c480a83bc20dd69c461f3a4c526c1d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:06 crc kubenswrapper[4756]: E0318 14:13:06.333641 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators(b1ddfb6b-2173-4ddc-84aa-437858c62a2a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators(b1ddfb6b-2173-4ddc-84aa-437858c62a2a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators_b1ddfb6b-2173-4ddc-84aa-437858c62a2a_0(9f3f9bf4c634b418615c864f3df677912c480a83bc20dd69c461f3a4c526c1d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" podUID="b1ddfb6b-2173-4ddc-84aa-437858c62a2a" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.441147 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc5k5\" (UniqueName: \"kubernetes.io/projected/626fab88-a2be-43fb-9679-6324c7105bd9-kube-api-access-dc5k5\") pod \"observability-operator-6dd7dd855f-7rvzd\" (UID: \"626fab88-a2be-43fb-9679-6324c7105bd9\") " pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.441256 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/626fab88-a2be-43fb-9679-6324c7105bd9-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-7rvzd\" (UID: \"626fab88-a2be-43fb-9679-6324c7105bd9\") " pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.543039 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc5k5\" (UniqueName: \"kubernetes.io/projected/626fab88-a2be-43fb-9679-6324c7105bd9-kube-api-access-dc5k5\") pod \"observability-operator-6dd7dd855f-7rvzd\" (UID: \"626fab88-a2be-43fb-9679-6324c7105bd9\") " pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.543084 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/626fab88-a2be-43fb-9679-6324c7105bd9-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-7rvzd\" (UID: \"626fab88-a2be-43fb-9679-6324c7105bd9\") " pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:06 crc 
kubenswrapper[4756]: I0318 14:13:06.548797 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/626fab88-a2be-43fb-9679-6324c7105bd9-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-7rvzd\" (UID: \"626fab88-a2be-43fb-9679-6324c7105bd9\") " pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.574812 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc5k5\" (UniqueName: \"kubernetes.io/projected/626fab88-a2be-43fb-9679-6324c7105bd9-kube-api-access-dc5k5\") pod \"observability-operator-6dd7dd855f-7rvzd\" (UID: \"626fab88-a2be-43fb-9679-6324c7105bd9\") " pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.598385 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:06 crc kubenswrapper[4756]: E0318 14:13:06.622261 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-7rvzd_openshift-operators_626fab88-a2be-43fb-9679-6324c7105bd9_0(9e3867b05f4d715d850b3e96700a9bb44e46085fd218f4a25da2d80008587dab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:13:06 crc kubenswrapper[4756]: E0318 14:13:06.622320 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-7rvzd_openshift-operators_626fab88-a2be-43fb-9679-6324c7105bd9_0(9e3867b05f4d715d850b3e96700a9bb44e46085fd218f4a25da2d80008587dab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:06 crc kubenswrapper[4756]: E0318 14:13:06.622342 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-7rvzd_openshift-operators_626fab88-a2be-43fb-9679-6324c7105bd9_0(9e3867b05f4d715d850b3e96700a9bb44e46085fd218f4a25da2d80008587dab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:06 crc kubenswrapper[4756]: E0318 14:13:06.622377 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-6dd7dd855f-7rvzd_openshift-operators(626fab88-a2be-43fb-9679-6324c7105bd9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-6dd7dd855f-7rvzd_openshift-operators(626fab88-a2be-43fb-9679-6324c7105bd9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-7rvzd_openshift-operators_626fab88-a2be-43fb-9679-6324c7105bd9_0(9e3867b05f4d715d850b3e96700a9bb44e46085fd218f4a25da2d80008587dab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" podUID="626fab88-a2be-43fb-9679-6324c7105bd9" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.660163 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-6cdcccbffc-hw5bz"] Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.660800 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.662355 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-v8bqk" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.663229 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.846379 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5044ac67-cd21-42cd-8fc4-63d7a532038d-apiservice-cert\") pod \"perses-operator-6cdcccbffc-hw5bz\" (UID: \"5044ac67-cd21-42cd-8fc4-63d7a532038d\") " pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.846740 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5044ac67-cd21-42cd-8fc4-63d7a532038d-webhook-cert\") pod \"perses-operator-6cdcccbffc-hw5bz\" (UID: \"5044ac67-cd21-42cd-8fc4-63d7a532038d\") " pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.846780 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8pvk\" (UniqueName: \"kubernetes.io/projected/5044ac67-cd21-42cd-8fc4-63d7a532038d-kube-api-access-r8pvk\") pod \"perses-operator-6cdcccbffc-hw5bz\" (UID: \"5044ac67-cd21-42cd-8fc4-63d7a532038d\") " pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.846860 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5044ac67-cd21-42cd-8fc4-63d7a532038d-openshift-service-ca\") pod \"perses-operator-6cdcccbffc-hw5bz\" (UID: \"5044ac67-cd21-42cd-8fc4-63d7a532038d\") " pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.948109 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5044ac67-cd21-42cd-8fc4-63d7a532038d-openshift-service-ca\") pod \"perses-operator-6cdcccbffc-hw5bz\" (UID: \"5044ac67-cd21-42cd-8fc4-63d7a532038d\") " pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.948198 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5044ac67-cd21-42cd-8fc4-63d7a532038d-apiservice-cert\") pod \"perses-operator-6cdcccbffc-hw5bz\" (UID: \"5044ac67-cd21-42cd-8fc4-63d7a532038d\") " pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.948228 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5044ac67-cd21-42cd-8fc4-63d7a532038d-webhook-cert\") pod \"perses-operator-6cdcccbffc-hw5bz\" (UID: \"5044ac67-cd21-42cd-8fc4-63d7a532038d\") " pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.948249 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8pvk\" (UniqueName: \"kubernetes.io/projected/5044ac67-cd21-42cd-8fc4-63d7a532038d-kube-api-access-r8pvk\") pod \"perses-operator-6cdcccbffc-hw5bz\" (UID: \"5044ac67-cd21-42cd-8fc4-63d7a532038d\") " pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.949166 4756 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5044ac67-cd21-42cd-8fc4-63d7a532038d-openshift-service-ca\") pod \"perses-operator-6cdcccbffc-hw5bz\" (UID: \"5044ac67-cd21-42cd-8fc4-63d7a532038d\") " pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.960033 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5044ac67-cd21-42cd-8fc4-63d7a532038d-webhook-cert\") pod \"perses-operator-6cdcccbffc-hw5bz\" (UID: \"5044ac67-cd21-42cd-8fc4-63d7a532038d\") " pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.960563 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5044ac67-cd21-42cd-8fc4-63d7a532038d-apiservice-cert\") pod \"perses-operator-6cdcccbffc-hw5bz\" (UID: \"5044ac67-cd21-42cd-8fc4-63d7a532038d\") " pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.971969 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8pvk\" (UniqueName: \"kubernetes.io/projected/5044ac67-cd21-42cd-8fc4-63d7a532038d-kube-api-access-r8pvk\") pod \"perses-operator-6cdcccbffc-hw5bz\" (UID: \"5044ac67-cd21-42cd-8fc4-63d7a532038d\") " pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:06 crc kubenswrapper[4756]: I0318 14:13:06.974553 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.005066 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-6cdcccbffc-hw5bz_openshift-operators_5044ac67-cd21-42cd-8fc4-63d7a532038d_0(70571c40c204acdac1e101cec2b41851ccfe3e65a037fad7d7bcf9fdea9c473e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.005132 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-6cdcccbffc-hw5bz_openshift-operators_5044ac67-cd21-42cd-8fc4-63d7a532038d_0(70571c40c204acdac1e101cec2b41851ccfe3e65a037fad7d7bcf9fdea9c473e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.005187 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-6cdcccbffc-hw5bz_openshift-operators_5044ac67-cd21-42cd-8fc4-63d7a532038d_0(70571c40c204acdac1e101cec2b41851ccfe3e65a037fad7d7bcf9fdea9c473e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.005232 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-6cdcccbffc-hw5bz_openshift-operators(5044ac67-cd21-42cd-8fc4-63d7a532038d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-6cdcccbffc-hw5bz_openshift-operators(5044ac67-cd21-42cd-8fc4-63d7a532038d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-6cdcccbffc-hw5bz_openshift-operators_5044ac67-cd21-42cd-8fc4-63d7a532038d_0(70571c40c204acdac1e101cec2b41851ccfe3e65a037fad7d7bcf9fdea9c473e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" podUID="5044ac67-cd21-42cd-8fc4-63d7a532038d" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.615768 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-6cdcccbffc-hw5bz"] Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.637002 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw"] Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.637164 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.637614 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.647225 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd"] Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.647346 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.647709 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.652252 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp"] Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.652332 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.652682 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.665218 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-7rvzd"] Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.665323 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.665694 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.674229 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators_b1ddfb6b-2173-4ddc-84aa-437858c62a2a_0(af1ab9ef82f653e0c1401d843a86c6d43c65ef513821b208b542ef87488c2218): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.674280 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators_b1ddfb6b-2173-4ddc-84aa-437858c62a2a_0(af1ab9ef82f653e0c1401d843a86c6d43c65ef513821b208b542ef87488c2218): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.674302 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators_b1ddfb6b-2173-4ddc-84aa-437858c62a2a_0(af1ab9ef82f653e0c1401d843a86c6d43c65ef513821b208b542ef87488c2218): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.674336 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators(b1ddfb6b-2173-4ddc-84aa-437858c62a2a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators(b1ddfb6b-2173-4ddc-84aa-437858c62a2a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators_b1ddfb6b-2173-4ddc-84aa-437858c62a2a_0(af1ab9ef82f653e0c1401d843a86c6d43c65ef513821b208b542ef87488c2218): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" podUID="b1ddfb6b-2173-4ddc-84aa-437858c62a2a" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.724439 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators_df2f3290-a194-4fa7-9c5c-533c329bc34b_0(51a708f9eb2796b4ba6386dc89746e4f51a6d4e3a9064c82661de7ee0ed0902d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.724522 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators_df2f3290-a194-4fa7-9c5c-533c329bc34b_0(51a708f9eb2796b4ba6386dc89746e4f51a6d4e3a9064c82661de7ee0ed0902d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.724550 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators_df2f3290-a194-4fa7-9c5c-533c329bc34b_0(51a708f9eb2796b4ba6386dc89746e4f51a6d4e3a9064c82661de7ee0ed0902d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.724606 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators(df2f3290-a194-4fa7-9c5c-533c329bc34b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators(df2f3290-a194-4fa7-9c5c-533c329bc34b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators_df2f3290-a194-4fa7-9c5c-533c329bc34b_0(51a708f9eb2796b4ba6386dc89746e4f51a6d4e3a9064c82661de7ee0ed0902d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" podUID="df2f3290-a194-4fa7-9c5c-533c329bc34b" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.762419 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.762854 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.763889 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" event={"ID":"061feedd-a872-42ca-8f94-4a73aec049e6","Type":"ContainerStarted","Data":"80c5eaf061688fcb216654483509ce09d369c7deebbd40625ca94593aefdab2c"} Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.764045 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.764066 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.764075 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.774307 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-7rvzd_openshift-operators_626fab88-a2be-43fb-9679-6324c7105bd9_0(b16000ab018d94964cb7dcd39c3b2bf44b4cd93aba18ccee04a21e34b7276197): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.774373 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-7rvzd_openshift-operators_626fab88-a2be-43fb-9679-6324c7105bd9_0(b16000ab018d94964cb7dcd39c3b2bf44b4cd93aba18ccee04a21e34b7276197): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.774400 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-7rvzd_openshift-operators_626fab88-a2be-43fb-9679-6324c7105bd9_0(b16000ab018d94964cb7dcd39c3b2bf44b4cd93aba18ccee04a21e34b7276197): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.774447 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-6dd7dd855f-7rvzd_openshift-operators(626fab88-a2be-43fb-9679-6324c7105bd9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-6dd7dd855f-7rvzd_openshift-operators(626fab88-a2be-43fb-9679-6324c7105bd9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-7rvzd_openshift-operators_626fab88-a2be-43fb-9679-6324c7105bd9_0(b16000ab018d94964cb7dcd39c3b2bf44b4cd93aba18ccee04a21e34b7276197): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" podUID="626fab88-a2be-43fb-9679-6324c7105bd9" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.802956 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" podStartSLOduration=7.8029419 podStartE2EDuration="7.8029419s" podCreationTimestamp="2026-03-18 14:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:13:07.797558123 +0000 UTC m=+789.111976098" watchObservedRunningTime="2026-03-18 14:13:07.8029419 +0000 UTC m=+789.117359875" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.812281 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators_7cef672e-9e83-4a19-90e0-8d078a871e02_0(6be01cfa0310c5e6a1b7956f94978db72818b3f151937d1483798b23a492f076): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.812343 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators_7cef672e-9e83-4a19-90e0-8d078a871e02_0(6be01cfa0310c5e6a1b7956f94978db72818b3f151937d1483798b23a492f076): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.812362 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators_7cef672e-9e83-4a19-90e0-8d078a871e02_0(6be01cfa0310c5e6a1b7956f94978db72818b3f151937d1483798b23a492f076): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.812407 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators(7cef672e-9e83-4a19-90e0-8d078a871e02)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators(7cef672e-9e83-4a19-90e0-8d078a871e02)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators_7cef672e-9e83-4a19-90e0-8d078a871e02_0(6be01cfa0310c5e6a1b7956f94978db72818b3f151937d1483798b23a492f076): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" podUID="7cef672e-9e83-4a19-90e0-8d078a871e02" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.835258 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-6cdcccbffc-hw5bz_openshift-operators_5044ac67-cd21-42cd-8fc4-63d7a532038d_0(ffb2492b7753c0b2880527cce8fb11d75f349751e8837fcd3db5c8c750a9b4b8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.835325 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-6cdcccbffc-hw5bz_openshift-operators_5044ac67-cd21-42cd-8fc4-63d7a532038d_0(ffb2492b7753c0b2880527cce8fb11d75f349751e8837fcd3db5c8c750a9b4b8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.835343 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-6cdcccbffc-hw5bz_openshift-operators_5044ac67-cd21-42cd-8fc4-63d7a532038d_0(ffb2492b7753c0b2880527cce8fb11d75f349751e8837fcd3db5c8c750a9b4b8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:07 crc kubenswrapper[4756]: E0318 14:13:07.835384 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-6cdcccbffc-hw5bz_openshift-operators(5044ac67-cd21-42cd-8fc4-63d7a532038d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-6cdcccbffc-hw5bz_openshift-operators(5044ac67-cd21-42cd-8fc4-63d7a532038d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-6cdcccbffc-hw5bz_openshift-operators_5044ac67-cd21-42cd-8fc4-63d7a532038d_0(ffb2492b7753c0b2880527cce8fb11d75f349751e8837fcd3db5c8c750a9b4b8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" podUID="5044ac67-cd21-42cd-8fc4-63d7a532038d" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.846402 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:07 crc kubenswrapper[4756]: I0318 14:13:07.847311 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:14 crc kubenswrapper[4756]: I0318 14:13:14.315517 4756 scope.go:117] "RemoveContainer" containerID="6d271e322ff997b6b5d2c9dcc6a298d8e41b723e1cb2c048962de813499e1b54" Mar 18 14:13:14 crc kubenswrapper[4756]: E0318 14:13:14.316632 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wz5hm_openshift-multus(13703604-4b4e-4eb2-b311-88457b667918)\"" pod="openshift-multus/multus-wz5hm" podUID="13703604-4b4e-4eb2-b311-88457b667918" Mar 18 14:13:19 crc kubenswrapper[4756]: I0318 14:13:19.315021 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:19 crc kubenswrapper[4756]: I0318 14:13:19.315136 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:19 crc kubenswrapper[4756]: I0318 14:13:19.317758 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:19 crc kubenswrapper[4756]: I0318 14:13:19.317767 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:19 crc kubenswrapper[4756]: E0318 14:13:19.359310 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators_df2f3290-a194-4fa7-9c5c-533c329bc34b_0(4c52735612c451d8bd7bec198f56db66a8143c9e5ff416ec9aad9b6df244bf92): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:13:19 crc kubenswrapper[4756]: E0318 14:13:19.359410 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators_df2f3290-a194-4fa7-9c5c-533c329bc34b_0(4c52735612c451d8bd7bec198f56db66a8143c9e5ff416ec9aad9b6df244bf92): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:19 crc kubenswrapper[4756]: E0318 14:13:19.359459 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators_df2f3290-a194-4fa7-9c5c-533c329bc34b_0(4c52735612c451d8bd7bec198f56db66a8143c9e5ff416ec9aad9b6df244bf92): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:19 crc kubenswrapper[4756]: E0318 14:13:19.359508 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators(df2f3290-a194-4fa7-9c5c-533c329bc34b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators(df2f3290-a194-4fa7-9c5c-533c329bc34b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-6kvvd_openshift-operators_df2f3290-a194-4fa7-9c5c-533c329bc34b_0(4c52735612c451d8bd7bec198f56db66a8143c9e5ff416ec9aad9b6df244bf92): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" podUID="df2f3290-a194-4fa7-9c5c-533c329bc34b" Mar 18 14:13:19 crc kubenswrapper[4756]: E0318 14:13:19.359324 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators_b1ddfb6b-2173-4ddc-84aa-437858c62a2a_0(07eeadc77aac411a4feb83daa8bb9a614bbca4e9f7d33cd3d62f3311fe4813f8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 14:13:19 crc kubenswrapper[4756]: E0318 14:13:19.359659 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators_b1ddfb6b-2173-4ddc-84aa-437858c62a2a_0(07eeadc77aac411a4feb83daa8bb9a614bbca4e9f7d33cd3d62f3311fe4813f8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:19 crc kubenswrapper[4756]: E0318 14:13:19.359688 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators_b1ddfb6b-2173-4ddc-84aa-437858c62a2a_0(07eeadc77aac411a4feb83daa8bb9a614bbca4e9f7d33cd3d62f3311fe4813f8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:19 crc kubenswrapper[4756]: E0318 14:13:19.359741 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators(b1ddfb6b-2173-4ddc-84aa-437858c62a2a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators(b1ddfb6b-2173-4ddc-84aa-437858c62a2a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_openshift-operators_b1ddfb6b-2173-4ddc-84aa-437858c62a2a_0(07eeadc77aac411a4feb83daa8bb9a614bbca4e9f7d33cd3d62f3311fe4813f8): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" podUID="b1ddfb6b-2173-4ddc-84aa-437858c62a2a" Mar 18 14:13:20 crc kubenswrapper[4756]: I0318 14:13:20.314365 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:20 crc kubenswrapper[4756]: I0318 14:13:20.314854 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:20 crc kubenswrapper[4756]: E0318 14:13:20.336713 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-7rvzd_openshift-operators_626fab88-a2be-43fb-9679-6324c7105bd9_0(7d4bb9b0ee426e82806ab5e307f0d32212f3c38713790fe6dea0d603e727f150): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:13:20 crc kubenswrapper[4756]: E0318 14:13:20.336765 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-7rvzd_openshift-operators_626fab88-a2be-43fb-9679-6324c7105bd9_0(7d4bb9b0ee426e82806ab5e307f0d32212f3c38713790fe6dea0d603e727f150): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:20 crc kubenswrapper[4756]: E0318 14:13:20.336785 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-7rvzd_openshift-operators_626fab88-a2be-43fb-9679-6324c7105bd9_0(7d4bb9b0ee426e82806ab5e307f0d32212f3c38713790fe6dea0d603e727f150): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:20 crc kubenswrapper[4756]: E0318 14:13:20.336827 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-6dd7dd855f-7rvzd_openshift-operators(626fab88-a2be-43fb-9679-6324c7105bd9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-6dd7dd855f-7rvzd_openshift-operators(626fab88-a2be-43fb-9679-6324c7105bd9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-7rvzd_openshift-operators_626fab88-a2be-43fb-9679-6324c7105bd9_0(7d4bb9b0ee426e82806ab5e307f0d32212f3c38713790fe6dea0d603e727f150): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" podUID="626fab88-a2be-43fb-9679-6324c7105bd9" Mar 18 14:13:21 crc kubenswrapper[4756]: I0318 14:13:21.315275 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:21 crc kubenswrapper[4756]: I0318 14:13:21.316270 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:21 crc kubenswrapper[4756]: E0318 14:13:21.345257 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-6cdcccbffc-hw5bz_openshift-operators_5044ac67-cd21-42cd-8fc4-63d7a532038d_0(7a863dc7edbce67daef65dd39a0b1e199d1dfac3fdf624882fa68da078c483f4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 14:13:21 crc kubenswrapper[4756]: E0318 14:13:21.345328 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-6cdcccbffc-hw5bz_openshift-operators_5044ac67-cd21-42cd-8fc4-63d7a532038d_0(7a863dc7edbce67daef65dd39a0b1e199d1dfac3fdf624882fa68da078c483f4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:21 crc kubenswrapper[4756]: E0318 14:13:21.345355 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-6cdcccbffc-hw5bz_openshift-operators_5044ac67-cd21-42cd-8fc4-63d7a532038d_0(7a863dc7edbce67daef65dd39a0b1e199d1dfac3fdf624882fa68da078c483f4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:21 crc kubenswrapper[4756]: E0318 14:13:21.345407 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-6cdcccbffc-hw5bz_openshift-operators(5044ac67-cd21-42cd-8fc4-63d7a532038d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-6cdcccbffc-hw5bz_openshift-operators(5044ac67-cd21-42cd-8fc4-63d7a532038d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-6cdcccbffc-hw5bz_openshift-operators_5044ac67-cd21-42cd-8fc4-63d7a532038d_0(7a863dc7edbce67daef65dd39a0b1e199d1dfac3fdf624882fa68da078c483f4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" podUID="5044ac67-cd21-42cd-8fc4-63d7a532038d" Mar 18 14:13:22 crc kubenswrapper[4756]: I0318 14:13:22.314910 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:22 crc kubenswrapper[4756]: I0318 14:13:22.315434 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:22 crc kubenswrapper[4756]: E0318 14:13:22.342479 4756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators_7cef672e-9e83-4a19-90e0-8d078a871e02_0(33273d2954dc88710f909a521aec9f90e1610985d34f976a9264708a66682c3f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 14:13:22 crc kubenswrapper[4756]: E0318 14:13:22.342548 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators_7cef672e-9e83-4a19-90e0-8d078a871e02_0(33273d2954dc88710f909a521aec9f90e1610985d34f976a9264708a66682c3f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:22 crc kubenswrapper[4756]: E0318 14:13:22.342574 4756 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators_7cef672e-9e83-4a19-90e0-8d078a871e02_0(33273d2954dc88710f909a521aec9f90e1610985d34f976a9264708a66682c3f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:22 crc kubenswrapper[4756]: E0318 14:13:22.342631 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators(7cef672e-9e83-4a19-90e0-8d078a871e02)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators(7cef672e-9e83-4a19-90e0-8d078a871e02)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_openshift-operators_7cef672e-9e83-4a19-90e0-8d078a871e02_0(33273d2954dc88710f909a521aec9f90e1610985d34f976a9264708a66682c3f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" podUID="7cef672e-9e83-4a19-90e0-8d078a871e02" Mar 18 14:13:25 crc kubenswrapper[4756]: I0318 14:13:25.314949 4756 scope.go:117] "RemoveContainer" containerID="6d271e322ff997b6b5d2c9dcc6a298d8e41b723e1cb2c048962de813499e1b54" Mar 18 14:13:25 crc kubenswrapper[4756]: I0318 14:13:25.850834 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wz5hm_13703604-4b4e-4eb2-b311-88457b667918/kube-multus/2.log" Mar 18 14:13:25 crc kubenswrapper[4756]: I0318 14:13:25.851186 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wz5hm" event={"ID":"13703604-4b4e-4eb2-b311-88457b667918","Type":"ContainerStarted","Data":"441f46feeda4cbbff47005d3df508f26fa2a1f42cffc04403d901c9e5c5ba69e"} Mar 18 14:13:30 crc kubenswrapper[4756]: I0318 14:13:30.315194 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:30 crc kubenswrapper[4756]: I0318 14:13:30.316113 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" Mar 18 14:13:30 crc kubenswrapper[4756]: I0318 14:13:30.561492 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd"] Mar 18 14:13:30 crc kubenswrapper[4756]: I0318 14:13:30.877592 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" event={"ID":"df2f3290-a194-4fa7-9c5c-533c329bc34b","Type":"ContainerStarted","Data":"0a0c3296b213286353717b6e1856cf6b8ed6c86ce2b92699e5e58d60c3b14393"} Mar 18 14:13:31 crc kubenswrapper[4756]: I0318 14:13:31.012721 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mn5sl" Mar 18 14:13:32 crc kubenswrapper[4756]: I0318 14:13:32.314870 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:32 crc kubenswrapper[4756]: I0318 14:13:32.316286 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:32 crc kubenswrapper[4756]: I0318 14:13:32.316986 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:32 crc kubenswrapper[4756]: I0318 14:13:32.317606 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" Mar 18 14:13:32 crc kubenswrapper[4756]: I0318 14:13:32.610989 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-6cdcccbffc-hw5bz"] Mar 18 14:13:32 crc kubenswrapper[4756]: W0318 14:13:32.619905 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5044ac67_cd21_42cd_8fc4_63d7a532038d.slice/crio-3157e7daedf13ec03d4365cc953a10b5e41acf50e781290889f4a70c54f81d69 WatchSource:0}: Error finding container 3157e7daedf13ec03d4365cc953a10b5e41acf50e781290889f4a70c54f81d69: Status 404 returned error can't find the container with id 3157e7daedf13ec03d4365cc953a10b5e41acf50e781290889f4a70c54f81d69 Mar 18 14:13:32 crc kubenswrapper[4756]: I0318 14:13:32.757651 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw"] Mar 18 14:13:32 crc kubenswrapper[4756]: W0318 14:13:32.762476 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1ddfb6b_2173_4ddc_84aa_437858c62a2a.slice/crio-2e4ef5e23a19c0fbb4100bcb3ac570f63467b0f9c5331125a28d1a95cb23cfe6 WatchSource:0}: Error finding container 2e4ef5e23a19c0fbb4100bcb3ac570f63467b0f9c5331125a28d1a95cb23cfe6: Status 404 returned error can't find the container with id 2e4ef5e23a19c0fbb4100bcb3ac570f63467b0f9c5331125a28d1a95cb23cfe6 Mar 18 14:13:32 crc kubenswrapper[4756]: I0318 14:13:32.900815 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" event={"ID":"b1ddfb6b-2173-4ddc-84aa-437858c62a2a","Type":"ContainerStarted","Data":"2e4ef5e23a19c0fbb4100bcb3ac570f63467b0f9c5331125a28d1a95cb23cfe6"} Mar 18 14:13:32 crc kubenswrapper[4756]: I0318 14:13:32.904999 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" event={"ID":"5044ac67-cd21-42cd-8fc4-63d7a532038d","Type":"ContainerStarted","Data":"3157e7daedf13ec03d4365cc953a10b5e41acf50e781290889f4a70c54f81d69"} Mar 18 14:13:34 crc kubenswrapper[4756]: I0318 14:13:34.315433 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:34 crc kubenswrapper[4756]: I0318 14:13:34.316476 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:35 crc kubenswrapper[4756]: I0318 14:13:35.314742 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:35 crc kubenswrapper[4756]: I0318 14:13:35.315295 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" Mar 18 14:13:35 crc kubenswrapper[4756]: I0318 14:13:35.538280 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-7rvzd"] Mar 18 14:13:35 crc kubenswrapper[4756]: W0318 14:13:35.548560 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod626fab88_a2be_43fb_9679_6324c7105bd9.slice/crio-b0d1cda2829cfa7568977e97ee7e864271864c2961444359464b614e73a27484 WatchSource:0}: Error finding container b0d1cda2829cfa7568977e97ee7e864271864c2961444359464b614e73a27484: Status 404 returned error can't find the container with id b0d1cda2829cfa7568977e97ee7e864271864c2961444359464b614e73a27484 Mar 18 14:13:35 crc kubenswrapper[4756]: I0318 14:13:35.584739 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp"] Mar 18 14:13:35 crc kubenswrapper[4756]: W0318 14:13:35.588875 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cef672e_9e83_4a19_90e0_8d078a871e02.slice/crio-ab4ba6a0c0b843f2bc314f2b9da5ac0778c22e1422d3717f6b8213f63bf9c067 WatchSource:0}: Error finding container ab4ba6a0c0b843f2bc314f2b9da5ac0778c22e1422d3717f6b8213f63bf9c067: Status 404 returned error can't find the container with id ab4ba6a0c0b843f2bc314f2b9da5ac0778c22e1422d3717f6b8213f63bf9c067 Mar 18 14:13:35 crc kubenswrapper[4756]: I0318 14:13:35.925541 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" event={"ID":"626fab88-a2be-43fb-9679-6324c7105bd9","Type":"ContainerStarted","Data":"b0d1cda2829cfa7568977e97ee7e864271864c2961444359464b614e73a27484"} Mar 18 14:13:35 crc kubenswrapper[4756]: I0318 14:13:35.926716 4756 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" event={"ID":"7cef672e-9e83-4a19-90e0-8d078a871e02","Type":"ContainerStarted","Data":"ab4ba6a0c0b843f2bc314f2b9da5ac0778c22e1422d3717f6b8213f63bf9c067"} Mar 18 14:13:35 crc kubenswrapper[4756]: I0318 14:13:35.928086 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" event={"ID":"df2f3290-a194-4fa7-9c5c-533c329bc34b","Type":"ContainerStarted","Data":"c9f9126df51c47283bc760018ec1b900f5701d509b47b464a48726c346c354b9"} Mar 18 14:13:37 crc kubenswrapper[4756]: I0318 14:13:37.944896 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" event={"ID":"b1ddfb6b-2173-4ddc-84aa-437858c62a2a","Type":"ContainerStarted","Data":"ff98af6336031315e22f9735f8fa2b0d2f7721a4a87037c0b67f9a2499020b1d"} Mar 18 14:13:37 crc kubenswrapper[4756]: I0318 14:13:37.947092 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" event={"ID":"7cef672e-9e83-4a19-90e0-8d078a871e02","Type":"ContainerStarted","Data":"fd4cb0aea47c712f0b2e3f6673f83260857fc4474ce2cd848a27084a29ad2f64"} Mar 18 14:13:37 crc kubenswrapper[4756]: I0318 14:13:37.949044 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" event={"ID":"5044ac67-cd21-42cd-8fc4-63d7a532038d","Type":"ContainerStarted","Data":"532581c1f7412406642c137187f392141c298afea1a4beb0771fd6dedb72d6e0"} Mar 18 14:13:37 crc kubenswrapper[4756]: I0318 14:13:37.949394 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:37 crc kubenswrapper[4756]: I0318 14:13:37.968020 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-8ff7d675-6kvvd" podStartSLOduration=28.16783943 podStartE2EDuration="32.968003132s" podCreationTimestamp="2026-03-18 14:13:05 +0000 UTC" firstStartedPulling="2026-03-18 14:13:30.575314964 +0000 UTC m=+811.889732949" lastFinishedPulling="2026-03-18 14:13:35.375478666 +0000 UTC m=+816.689896651" observedRunningTime="2026-03-18 14:13:35.944610372 +0000 UTC m=+817.259028477" watchObservedRunningTime="2026-03-18 14:13:37.968003132 +0000 UTC m=+819.282421107" Mar 18 14:13:37 crc kubenswrapper[4756]: I0318 14:13:37.972539 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw" podStartSLOduration=28.863186545 podStartE2EDuration="32.972476074s" podCreationTimestamp="2026-03-18 14:13:05 +0000 UTC" firstStartedPulling="2026-03-18 14:13:32.766698577 +0000 UTC m=+814.081116552" lastFinishedPulling="2026-03-18 14:13:36.875988106 +0000 UTC m=+818.190406081" observedRunningTime="2026-03-18 14:13:37.960496428 +0000 UTC m=+819.274914413" watchObservedRunningTime="2026-03-18 14:13:37.972476074 +0000 UTC m=+819.286894049" Mar 18 14:13:37 crc kubenswrapper[4756]: I0318 14:13:37.981003 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp" podStartSLOduration=31.468895141 podStartE2EDuration="32.980985125s" podCreationTimestamp="2026-03-18 14:13:05 +0000 UTC" firstStartedPulling="2026-03-18 14:13:35.591275616 +0000 UTC m=+816.905693591" lastFinishedPulling="2026-03-18 14:13:37.10336558 +0000 UTC m=+818.417783575" observedRunningTime="2026-03-18 14:13:37.977611873 +0000 UTC m=+819.292029858" watchObservedRunningTime="2026-03-18 14:13:37.980985125 +0000 UTC m=+819.295403110" Mar 18 14:13:38 crc kubenswrapper[4756]: I0318 14:13:38.001018 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" podStartSLOduration=27.74845264 podStartE2EDuration="32.000999969s" podCreationTimestamp="2026-03-18 14:13:06 +0000 UTC" firstStartedPulling="2026-03-18 14:13:32.623621301 +0000 UTC m=+813.938039276" lastFinishedPulling="2026-03-18 14:13:36.87616863 +0000 UTC m=+818.190586605" observedRunningTime="2026-03-18 14:13:37.997617097 +0000 UTC m=+819.312035072" watchObservedRunningTime="2026-03-18 14:13:38.000999969 +0000 UTC m=+819.315417954" Mar 18 14:13:40 crc kubenswrapper[4756]: I0318 14:13:40.979333 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" event={"ID":"626fab88-a2be-43fb-9679-6324c7105bd9","Type":"ContainerStarted","Data":"1ab3648fbaa2cfa2db761fc198e1275ef7e271cbc4f70deab3871b71d2add3a9"} Mar 18 14:13:40 crc kubenswrapper[4756]: I0318 14:13:40.979966 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:41 crc kubenswrapper[4756]: I0318 14:13:41.003217 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" podStartSLOduration=30.423079175 podStartE2EDuration="35.003197571s" podCreationTimestamp="2026-03-18 14:13:06 +0000 UTC" firstStartedPulling="2026-03-18 14:13:35.550499018 +0000 UTC m=+816.864916993" lastFinishedPulling="2026-03-18 14:13:40.130617414 +0000 UTC m=+821.445035389" observedRunningTime="2026-03-18 14:13:40.998414911 +0000 UTC m=+822.312832926" watchObservedRunningTime="2026-03-18 14:13:41.003197571 +0000 UTC m=+822.317615556" Mar 18 14:13:41 crc kubenswrapper[4756]: I0318 14:13:41.092806 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-7rvzd" Mar 18 14:13:45 crc kubenswrapper[4756]: I0318 14:13:45.838946 4756 dynamic_cafile_content.go:123] "Loaded a new CA 
Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 14:13:46 crc kubenswrapper[4756]: I0318 14:13:46.978529 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-6cdcccbffc-hw5bz" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.064862 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-7t9v8"] Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.065843 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7t9v8" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.068321 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n75wg"] Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.068992 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n75wg" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.072207 4756 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-l4w69" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.072465 4756 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-mv8nk" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.072588 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.072800 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.082638 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n75wg"] Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.086061 4756 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-webhook-687f57d79b-9dtmc"] Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.086833 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-9dtmc" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.088619 4756 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-4xjqb" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.093346 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7t9v8"] Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.112647 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-9dtmc"] Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.200907 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ks4b\" (UniqueName: \"kubernetes.io/projected/0b2d2be8-2089-4bf6-9dec-ac1070616f89-kube-api-access-5ks4b\") pod \"cert-manager-webhook-687f57d79b-9dtmc\" (UID: \"0b2d2be8-2089-4bf6-9dec-ac1070616f89\") " pod="cert-manager/cert-manager-webhook-687f57d79b-9dtmc" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.201011 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xwbh\" (UniqueName: \"kubernetes.io/projected/59b96a93-c409-494f-9c33-3cf8612a5c3c-kube-api-access-4xwbh\") pod \"cert-manager-cainjector-cf98fcc89-n75wg\" (UID: \"59b96a93-c409-494f-9c33-3cf8612a5c3c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n75wg" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.201191 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmn4g\" (UniqueName: \"kubernetes.io/projected/456f41d7-b340-4611-914b-cb23b10b8644-kube-api-access-xmn4g\") pod 
\"cert-manager-858654f9db-7t9v8\" (UID: \"456f41d7-b340-4611-914b-cb23b10b8644\") " pod="cert-manager/cert-manager-858654f9db-7t9v8" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.302631 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmn4g\" (UniqueName: \"kubernetes.io/projected/456f41d7-b340-4611-914b-cb23b10b8644-kube-api-access-xmn4g\") pod \"cert-manager-858654f9db-7t9v8\" (UID: \"456f41d7-b340-4611-914b-cb23b10b8644\") " pod="cert-manager/cert-manager-858654f9db-7t9v8" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.302689 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ks4b\" (UniqueName: \"kubernetes.io/projected/0b2d2be8-2089-4bf6-9dec-ac1070616f89-kube-api-access-5ks4b\") pod \"cert-manager-webhook-687f57d79b-9dtmc\" (UID: \"0b2d2be8-2089-4bf6-9dec-ac1070616f89\") " pod="cert-manager/cert-manager-webhook-687f57d79b-9dtmc" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.302721 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xwbh\" (UniqueName: \"kubernetes.io/projected/59b96a93-c409-494f-9c33-3cf8612a5c3c-kube-api-access-4xwbh\") pod \"cert-manager-cainjector-cf98fcc89-n75wg\" (UID: \"59b96a93-c409-494f-9c33-3cf8612a5c3c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n75wg" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.326979 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xwbh\" (UniqueName: \"kubernetes.io/projected/59b96a93-c409-494f-9c33-3cf8612a5c3c-kube-api-access-4xwbh\") pod \"cert-manager-cainjector-cf98fcc89-n75wg\" (UID: \"59b96a93-c409-494f-9c33-3cf8612a5c3c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n75wg" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.327701 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmn4g\" (UniqueName: 
\"kubernetes.io/projected/456f41d7-b340-4611-914b-cb23b10b8644-kube-api-access-xmn4g\") pod \"cert-manager-858654f9db-7t9v8\" (UID: \"456f41d7-b340-4611-914b-cb23b10b8644\") " pod="cert-manager/cert-manager-858654f9db-7t9v8" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.329823 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ks4b\" (UniqueName: \"kubernetes.io/projected/0b2d2be8-2089-4bf6-9dec-ac1070616f89-kube-api-access-5ks4b\") pod \"cert-manager-webhook-687f57d79b-9dtmc\" (UID: \"0b2d2be8-2089-4bf6-9dec-ac1070616f89\") " pod="cert-manager/cert-manager-webhook-687f57d79b-9dtmc" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.382933 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7t9v8" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.389394 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n75wg" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.414405 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-9dtmc" Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.814623 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n75wg"] Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.873079 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7t9v8"] Mar 18 14:13:51 crc kubenswrapper[4756]: I0318 14:13:51.917861 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-9dtmc"] Mar 18 14:13:51 crc kubenswrapper[4756]: W0318 14:13:51.922020 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b2d2be8_2089_4bf6_9dec_ac1070616f89.slice/crio-d1652333bd0fda844c61ed77112b829bacd6414f1c307ffa7923f596078aa012 WatchSource:0}: Error finding container d1652333bd0fda844c61ed77112b829bacd6414f1c307ffa7923f596078aa012: Status 404 returned error can't find the container with id d1652333bd0fda844c61ed77112b829bacd6414f1c307ffa7923f596078aa012 Mar 18 14:13:52 crc kubenswrapper[4756]: I0318 14:13:52.048778 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-9dtmc" event={"ID":"0b2d2be8-2089-4bf6-9dec-ac1070616f89","Type":"ContainerStarted","Data":"d1652333bd0fda844c61ed77112b829bacd6414f1c307ffa7923f596078aa012"} Mar 18 14:13:52 crc kubenswrapper[4756]: I0318 14:13:52.050367 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7t9v8" event={"ID":"456f41d7-b340-4611-914b-cb23b10b8644","Type":"ContainerStarted","Data":"b0708659a592fa8c9c55a929df30e48b846369dd3d5dd411592756a1fef00ffa"} Mar 18 14:13:52 crc kubenswrapper[4756]: I0318 14:13:52.051337 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n75wg" 
event={"ID":"59b96a93-c409-494f-9c33-3cf8612a5c3c","Type":"ContainerStarted","Data":"850c17ffb479e683659aaf3e7e7c4386e9be93f6ba566ea4a92a900f3b187955"} Mar 18 14:13:56 crc kubenswrapper[4756]: I0318 14:13:56.082261 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-9dtmc" event={"ID":"0b2d2be8-2089-4bf6-9dec-ac1070616f89","Type":"ContainerStarted","Data":"cfc92a481826def5a2f61b31e1472f3612c5dc1186da6e61a6d2edd99ceae23e"} Mar 18 14:13:56 crc kubenswrapper[4756]: I0318 14:13:56.082810 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-9dtmc" Mar 18 14:13:56 crc kubenswrapper[4756]: I0318 14:13:56.085230 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7t9v8" event={"ID":"456f41d7-b340-4611-914b-cb23b10b8644","Type":"ContainerStarted","Data":"a3a113670b037e3f13b503175936029398c92bebbc4c6b42cd8f11386a792763"} Mar 18 14:13:56 crc kubenswrapper[4756]: I0318 14:13:56.086764 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n75wg" event={"ID":"59b96a93-c409-494f-9c33-3cf8612a5c3c","Type":"ContainerStarted","Data":"965defa789a19e25354ae23322b1daa5cd3c0ce6580317d6eb7fc714463f267d"} Mar 18 14:13:56 crc kubenswrapper[4756]: I0318 14:13:56.103585 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-9dtmc" podStartSLOduration=1.243167323 podStartE2EDuration="5.103566052s" podCreationTimestamp="2026-03-18 14:13:51 +0000 UTC" firstStartedPulling="2026-03-18 14:13:51.924414186 +0000 UTC m=+833.238832161" lastFinishedPulling="2026-03-18 14:13:55.784812895 +0000 UTC m=+837.099230890" observedRunningTime="2026-03-18 14:13:56.099256444 +0000 UTC m=+837.413674429" watchObservedRunningTime="2026-03-18 14:13:56.103566052 +0000 UTC m=+837.417984027" Mar 18 14:13:56 crc kubenswrapper[4756]: I0318 
14:13:56.121332 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-7t9v8" podStartSLOduration=1.275840649 podStartE2EDuration="5.121313513s" podCreationTimestamp="2026-03-18 14:13:51 +0000 UTC" firstStartedPulling="2026-03-18 14:13:51.879168137 +0000 UTC m=+833.193586112" lastFinishedPulling="2026-03-18 14:13:55.724640991 +0000 UTC m=+837.039058976" observedRunningTime="2026-03-18 14:13:56.115347231 +0000 UTC m=+837.429765206" watchObservedRunningTime="2026-03-18 14:13:56.121313513 +0000 UTC m=+837.435731488" Mar 18 14:13:56 crc kubenswrapper[4756]: I0318 14:13:56.131618 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n75wg" podStartSLOduration=1.23205334 podStartE2EDuration="5.131597713s" podCreationTimestamp="2026-03-18 14:13:51 +0000 UTC" firstStartedPulling="2026-03-18 14:13:51.824218254 +0000 UTC m=+833.138636229" lastFinishedPulling="2026-03-18 14:13:55.723762627 +0000 UTC m=+837.038180602" observedRunningTime="2026-03-18 14:13:56.126224967 +0000 UTC m=+837.440642952" watchObservedRunningTime="2026-03-18 14:13:56.131597713 +0000 UTC m=+837.446015688" Mar 18 14:14:00 crc kubenswrapper[4756]: I0318 14:14:00.131698 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564054-kbzrm"] Mar 18 14:14:00 crc kubenswrapper[4756]: I0318 14:14:00.132943 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564054-kbzrm" Mar 18 14:14:00 crc kubenswrapper[4756]: I0318 14:14:00.137023 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:14:00 crc kubenswrapper[4756]: I0318 14:14:00.137201 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:14:00 crc kubenswrapper[4756]: I0318 14:14:00.137489 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:14:00 crc kubenswrapper[4756]: I0318 14:14:00.145926 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564054-kbzrm"] Mar 18 14:14:00 crc kubenswrapper[4756]: I0318 14:14:00.234500 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr9nk\" (UniqueName: \"kubernetes.io/projected/c1a89d50-a68b-4237-8259-af1876fd0f8e-kube-api-access-rr9nk\") pod \"auto-csr-approver-29564054-kbzrm\" (UID: \"c1a89d50-a68b-4237-8259-af1876fd0f8e\") " pod="openshift-infra/auto-csr-approver-29564054-kbzrm" Mar 18 14:14:00 crc kubenswrapper[4756]: I0318 14:14:00.335689 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr9nk\" (UniqueName: \"kubernetes.io/projected/c1a89d50-a68b-4237-8259-af1876fd0f8e-kube-api-access-rr9nk\") pod \"auto-csr-approver-29564054-kbzrm\" (UID: \"c1a89d50-a68b-4237-8259-af1876fd0f8e\") " pod="openshift-infra/auto-csr-approver-29564054-kbzrm" Mar 18 14:14:00 crc kubenswrapper[4756]: I0318 14:14:00.353246 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr9nk\" (UniqueName: \"kubernetes.io/projected/c1a89d50-a68b-4237-8259-af1876fd0f8e-kube-api-access-rr9nk\") pod \"auto-csr-approver-29564054-kbzrm\" (UID: \"c1a89d50-a68b-4237-8259-af1876fd0f8e\") " 
pod="openshift-infra/auto-csr-approver-29564054-kbzrm" Mar 18 14:14:00 crc kubenswrapper[4756]: I0318 14:14:00.456451 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564054-kbzrm" Mar 18 14:14:00 crc kubenswrapper[4756]: I0318 14:14:00.891777 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564054-kbzrm"] Mar 18 14:14:00 crc kubenswrapper[4756]: W0318 14:14:00.898628 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1a89d50_a68b_4237_8259_af1876fd0f8e.slice/crio-87241b56d606f230ac5ab1039de7d8d262269478229d95f98d942cbe7bef01c7 WatchSource:0}: Error finding container 87241b56d606f230ac5ab1039de7d8d262269478229d95f98d942cbe7bef01c7: Status 404 returned error can't find the container with id 87241b56d606f230ac5ab1039de7d8d262269478229d95f98d942cbe7bef01c7 Mar 18 14:14:01 crc kubenswrapper[4756]: I0318 14:14:01.141452 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564054-kbzrm" event={"ID":"c1a89d50-a68b-4237-8259-af1876fd0f8e","Type":"ContainerStarted","Data":"87241b56d606f230ac5ab1039de7d8d262269478229d95f98d942cbe7bef01c7"} Mar 18 14:14:01 crc kubenswrapper[4756]: I0318 14:14:01.418792 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-9dtmc" Mar 18 14:14:02 crc kubenswrapper[4756]: I0318 14:14:02.155207 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564054-kbzrm" event={"ID":"c1a89d50-a68b-4237-8259-af1876fd0f8e","Type":"ContainerStarted","Data":"799ebddf58392ceaef15766661e99efd77cb4050930d43e64dc63f69668d1500"} Mar 18 14:14:02 crc kubenswrapper[4756]: I0318 14:14:02.175170 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564054-kbzrm" 
podStartSLOduration=1.289332624 podStartE2EDuration="2.175147721s" podCreationTimestamp="2026-03-18 14:14:00 +0000 UTC" firstStartedPulling="2026-03-18 14:14:00.902454558 +0000 UTC m=+842.216872573" lastFinishedPulling="2026-03-18 14:14:01.788269685 +0000 UTC m=+843.102687670" observedRunningTime="2026-03-18 14:14:02.171131453 +0000 UTC m=+843.485549428" watchObservedRunningTime="2026-03-18 14:14:02.175147721 +0000 UTC m=+843.489565706" Mar 18 14:14:03 crc kubenswrapper[4756]: I0318 14:14:03.164371 4756 generic.go:334] "Generic (PLEG): container finished" podID="c1a89d50-a68b-4237-8259-af1876fd0f8e" containerID="799ebddf58392ceaef15766661e99efd77cb4050930d43e64dc63f69668d1500" exitCode=0 Mar 18 14:14:03 crc kubenswrapper[4756]: I0318 14:14:03.164435 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564054-kbzrm" event={"ID":"c1a89d50-a68b-4237-8259-af1876fd0f8e","Type":"ContainerDied","Data":"799ebddf58392ceaef15766661e99efd77cb4050930d43e64dc63f69668d1500"} Mar 18 14:14:04 crc kubenswrapper[4756]: I0318 14:14:04.467833 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564054-kbzrm" Mar 18 14:14:04 crc kubenswrapper[4756]: I0318 14:14:04.588745 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr9nk\" (UniqueName: \"kubernetes.io/projected/c1a89d50-a68b-4237-8259-af1876fd0f8e-kube-api-access-rr9nk\") pod \"c1a89d50-a68b-4237-8259-af1876fd0f8e\" (UID: \"c1a89d50-a68b-4237-8259-af1876fd0f8e\") " Mar 18 14:14:04 crc kubenswrapper[4756]: I0318 14:14:04.597167 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a89d50-a68b-4237-8259-af1876fd0f8e-kube-api-access-rr9nk" (OuterVolumeSpecName: "kube-api-access-rr9nk") pod "c1a89d50-a68b-4237-8259-af1876fd0f8e" (UID: "c1a89d50-a68b-4237-8259-af1876fd0f8e"). InnerVolumeSpecName "kube-api-access-rr9nk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:14:04 crc kubenswrapper[4756]: I0318 14:14:04.690662 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr9nk\" (UniqueName: \"kubernetes.io/projected/c1a89d50-a68b-4237-8259-af1876fd0f8e-kube-api-access-rr9nk\") on node \"crc\" DevicePath \"\"" Mar 18 14:14:05 crc kubenswrapper[4756]: I0318 14:14:05.183721 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564054-kbzrm" event={"ID":"c1a89d50-a68b-4237-8259-af1876fd0f8e","Type":"ContainerDied","Data":"87241b56d606f230ac5ab1039de7d8d262269478229d95f98d942cbe7bef01c7"} Mar 18 14:14:05 crc kubenswrapper[4756]: I0318 14:14:05.184260 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87241b56d606f230ac5ab1039de7d8d262269478229d95f98d942cbe7bef01c7" Mar 18 14:14:05 crc kubenswrapper[4756]: I0318 14:14:05.183895 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564054-kbzrm" Mar 18 14:14:05 crc kubenswrapper[4756]: I0318 14:14:05.246039 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564048-nw25j"] Mar 18 14:14:05 crc kubenswrapper[4756]: I0318 14:14:05.254273 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564048-nw25j"] Mar 18 14:14:05 crc kubenswrapper[4756]: I0318 14:14:05.331037 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="208a151e-414a-426e-9133-9bbcecd3445e" path="/var/lib/kubelet/pods/208a151e-414a-426e-9133-9bbcecd3445e/volumes" Mar 18 14:14:20 crc kubenswrapper[4756]: I0318 14:14:20.337638 4756 scope.go:117] "RemoveContainer" containerID="a8d2fb1e8122ad15752e4e8b6d34e91a4d25bf4f6de9c2f45ee9fb7d9c296ead" Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.471968 4756 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d"] Mar 18 14:14:25 crc kubenswrapper[4756]: E0318 14:14:25.472401 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a89d50-a68b-4237-8259-af1876fd0f8e" containerName="oc" Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.472413 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a89d50-a68b-4237-8259-af1876fd0f8e" containerName="oc" Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.472518 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1a89d50-a68b-4237-8259-af1876fd0f8e" containerName="oc" Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.473229 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.475734 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.507878 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d"] Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.592058 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frfs6\" (UniqueName: \"kubernetes.io/projected/81416b4c-681c-434a-9932-bbcc4ed16d11-kube-api-access-frfs6\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d\" (UID: \"81416b4c-681c-434a-9932-bbcc4ed16d11\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.592219 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/81416b4c-681c-434a-9932-bbcc4ed16d11-util\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d\" (UID: \"81416b4c-681c-434a-9932-bbcc4ed16d11\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.592250 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81416b4c-681c-434a-9932-bbcc4ed16d11-bundle\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d\" (UID: \"81416b4c-681c-434a-9932-bbcc4ed16d11\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.693506 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81416b4c-681c-434a-9932-bbcc4ed16d11-util\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d\" (UID: \"81416b4c-681c-434a-9932-bbcc4ed16d11\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.693562 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81416b4c-681c-434a-9932-bbcc4ed16d11-bundle\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d\" (UID: \"81416b4c-681c-434a-9932-bbcc4ed16d11\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.693616 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frfs6\" (UniqueName: \"kubernetes.io/projected/81416b4c-681c-434a-9932-bbcc4ed16d11-kube-api-access-frfs6\") pod 
\"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d\" (UID: \"81416b4c-681c-434a-9932-bbcc4ed16d11\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.694151 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81416b4c-681c-434a-9932-bbcc4ed16d11-util\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d\" (UID: \"81416b4c-681c-434a-9932-bbcc4ed16d11\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.694481 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81416b4c-681c-434a-9932-bbcc4ed16d11-bundle\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d\" (UID: \"81416b4c-681c-434a-9932-bbcc4ed16d11\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.716894 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frfs6\" (UniqueName: \"kubernetes.io/projected/81416b4c-681c-434a-9932-bbcc4ed16d11-kube-api-access-frfs6\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d\" (UID: \"81416b4c-681c-434a-9932-bbcc4ed16d11\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" Mar 18 14:14:25 crc kubenswrapper[4756]: I0318 14:14:25.809265 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" Mar 18 14:14:26 crc kubenswrapper[4756]: I0318 14:14:26.042836 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d"] Mar 18 14:14:26 crc kubenswrapper[4756]: W0318 14:14:26.050756 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81416b4c_681c_434a_9932_bbcc4ed16d11.slice/crio-298ce9a648e5c126d60b0d73937c633ffc86771d113c5a7aa54aacc53e9029b3 WatchSource:0}: Error finding container 298ce9a648e5c126d60b0d73937c633ffc86771d113c5a7aa54aacc53e9029b3: Status 404 returned error can't find the container with id 298ce9a648e5c126d60b0d73937c633ffc86771d113c5a7aa54aacc53e9029b3 Mar 18 14:14:26 crc kubenswrapper[4756]: I0318 14:14:26.348311 4756 generic.go:334] "Generic (PLEG): container finished" podID="81416b4c-681c-434a-9932-bbcc4ed16d11" containerID="8aa6d0fe42f416fa4d7b99d676b318bffac8dda867c4c285166f079eeeebb85b" exitCode=0 Mar 18 14:14:26 crc kubenswrapper[4756]: I0318 14:14:26.348352 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" event={"ID":"81416b4c-681c-434a-9932-bbcc4ed16d11","Type":"ContainerDied","Data":"8aa6d0fe42f416fa4d7b99d676b318bffac8dda867c4c285166f079eeeebb85b"} Mar 18 14:14:26 crc kubenswrapper[4756]: I0318 14:14:26.348374 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" event={"ID":"81416b4c-681c-434a-9932-bbcc4ed16d11","Type":"ContainerStarted","Data":"298ce9a648e5c126d60b0d73937c633ffc86771d113c5a7aa54aacc53e9029b3"} Mar 18 14:14:27 crc kubenswrapper[4756]: I0318 14:14:27.738890 4756 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-zfdz4"] Mar 18 14:14:27 crc kubenswrapper[4756]: I0318 14:14:27.740514 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:27 crc kubenswrapper[4756]: I0318 14:14:27.752890 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zfdz4"] Mar 18 14:14:27 crc kubenswrapper[4756]: I0318 14:14:27.922379 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-utilities\") pod \"redhat-operators-zfdz4\" (UID: \"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef\") " pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:27 crc kubenswrapper[4756]: I0318 14:14:27.922461 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-catalog-content\") pod \"redhat-operators-zfdz4\" (UID: \"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef\") " pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:27 crc kubenswrapper[4756]: I0318 14:14:27.922546 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtnqg\" (UniqueName: \"kubernetes.io/projected/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-kube-api-access-rtnqg\") pod \"redhat-operators-zfdz4\" (UID: \"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef\") " pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:27 crc kubenswrapper[4756]: I0318 14:14:27.956814 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Mar 18 14:14:27 crc kubenswrapper[4756]: I0318 14:14:27.957686 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 18 14:14:27 crc kubenswrapper[4756]: I0318 14:14:27.963853 4756 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-wwp67" Mar 18 14:14:27 crc kubenswrapper[4756]: I0318 14:14:27.964054 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Mar 18 14:14:27 crc kubenswrapper[4756]: I0318 14:14:27.964280 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Mar 18 14:14:27 crc kubenswrapper[4756]: I0318 14:14:27.967200 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.023509 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-utilities\") pod \"redhat-operators-zfdz4\" (UID: \"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef\") " pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.023575 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-catalog-content\") pod \"redhat-operators-zfdz4\" (UID: \"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef\") " pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.023635 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtnqg\" (UniqueName: \"kubernetes.io/projected/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-kube-api-access-rtnqg\") pod \"redhat-operators-zfdz4\" (UID: \"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef\") " pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.024104 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-utilities\") pod \"redhat-operators-zfdz4\" (UID: \"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef\") " pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.024156 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-catalog-content\") pod \"redhat-operators-zfdz4\" (UID: \"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef\") " pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.043962 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtnqg\" (UniqueName: \"kubernetes.io/projected/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-kube-api-access-rtnqg\") pod \"redhat-operators-zfdz4\" (UID: \"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef\") " pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.057332 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.124676 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e130c598-c1e2-4e82-b5cd-941bb81af0a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e130c598-c1e2-4e82-b5cd-941bb81af0a2\") pod \"minio\" (UID: \"586afc48-b008-4373-b2e9-a33285cbc666\") " pod="minio-dev/minio" Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.124921 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmvb4\" (UniqueName: \"kubernetes.io/projected/586afc48-b008-4373-b2e9-a33285cbc666-kube-api-access-gmvb4\") pod \"minio\" (UID: \"586afc48-b008-4373-b2e9-a33285cbc666\") " pod="minio-dev/minio" Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.226180 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e130c598-c1e2-4e82-b5cd-941bb81af0a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e130c598-c1e2-4e82-b5cd-941bb81af0a2\") pod \"minio\" (UID: \"586afc48-b008-4373-b2e9-a33285cbc666\") " pod="minio-dev/minio" Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.226416 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmvb4\" (UniqueName: \"kubernetes.io/projected/586afc48-b008-4373-b2e9-a33285cbc666-kube-api-access-gmvb4\") pod \"minio\" (UID: \"586afc48-b008-4373-b2e9-a33285cbc666\") " pod="minio-dev/minio" Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.228922 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.228972 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e130c598-c1e2-4e82-b5cd-941bb81af0a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e130c598-c1e2-4e82-b5cd-941bb81af0a2\") pod \"minio\" (UID: \"586afc48-b008-4373-b2e9-a33285cbc666\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/daabc4ae7611126c26b5f56a0a72231a991efd0a71a2b0589dfdcca6557482a4/globalmount\"" pod="minio-dev/minio" Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.246048 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmvb4\" (UniqueName: \"kubernetes.io/projected/586afc48-b008-4373-b2e9-a33285cbc666-kube-api-access-gmvb4\") pod \"minio\" (UID: \"586afc48-b008-4373-b2e9-a33285cbc666\") " pod="minio-dev/minio" Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.259073 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e130c598-c1e2-4e82-b5cd-941bb81af0a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e130c598-c1e2-4e82-b5cd-941bb81af0a2\") pod \"minio\" (UID: \"586afc48-b008-4373-b2e9-a33285cbc666\") " pod="minio-dev/minio" Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.272780 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.957509 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 18 14:14:28 crc kubenswrapper[4756]: I0318 14:14:28.998105 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zfdz4"] Mar 18 14:14:29 crc kubenswrapper[4756]: W0318 14:14:29.019931 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0166b9f_2fc7_4d1c_9b81_8267ddbaaeef.slice/crio-bda2db6e1e31125535d8bcae2beccd891c45607eda5abac1dd9726dab6e95ec2 WatchSource:0}: Error finding container bda2db6e1e31125535d8bcae2beccd891c45607eda5abac1dd9726dab6e95ec2: Status 404 returned error can't find the container with id bda2db6e1e31125535d8bcae2beccd891c45607eda5abac1dd9726dab6e95ec2 Mar 18 14:14:29 crc kubenswrapper[4756]: W0318 14:14:29.020451 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod586afc48_b008_4373_b2e9_a33285cbc666.slice/crio-f9d00de05f2db890fbd02636f673a50ee73565197958d550962dbb2b8f8c8b8f WatchSource:0}: Error finding container f9d00de05f2db890fbd02636f673a50ee73565197958d550962dbb2b8f8c8b8f: Status 404 returned error can't find the container with id f9d00de05f2db890fbd02636f673a50ee73565197958d550962dbb2b8f8c8b8f Mar 18 14:14:29 crc kubenswrapper[4756]: I0318 14:14:29.369281 4756 generic.go:334] "Generic (PLEG): container finished" podID="81416b4c-681c-434a-9932-bbcc4ed16d11" containerID="7781ea9e46e39fbb245fcb117cac35a07582e4356d8e7d573520d36ba51faa1a" exitCode=0 Mar 18 14:14:29 crc kubenswrapper[4756]: I0318 14:14:29.369341 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" 
event={"ID":"81416b4c-681c-434a-9932-bbcc4ed16d11","Type":"ContainerDied","Data":"7781ea9e46e39fbb245fcb117cac35a07582e4356d8e7d573520d36ba51faa1a"} Mar 18 14:14:29 crc kubenswrapper[4756]: I0318 14:14:29.371031 4756 generic.go:334] "Generic (PLEG): container finished" podID="f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" containerID="bf25a50e9bba3ca869feb637c71488856b249f583edae442d6dafe6ec215c800" exitCode=0 Mar 18 14:14:29 crc kubenswrapper[4756]: I0318 14:14:29.371160 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfdz4" event={"ID":"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef","Type":"ContainerDied","Data":"bf25a50e9bba3ca869feb637c71488856b249f583edae442d6dafe6ec215c800"} Mar 18 14:14:29 crc kubenswrapper[4756]: I0318 14:14:29.371197 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfdz4" event={"ID":"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef","Type":"ContainerStarted","Data":"bda2db6e1e31125535d8bcae2beccd891c45607eda5abac1dd9726dab6e95ec2"} Mar 18 14:14:29 crc kubenswrapper[4756]: I0318 14:14:29.372171 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"586afc48-b008-4373-b2e9-a33285cbc666","Type":"ContainerStarted","Data":"f9d00de05f2db890fbd02636f673a50ee73565197958d550962dbb2b8f8c8b8f"} Mar 18 14:14:30 crc kubenswrapper[4756]: I0318 14:14:30.395810 4756 generic.go:334] "Generic (PLEG): container finished" podID="81416b4c-681c-434a-9932-bbcc4ed16d11" containerID="c6cb171adc1bfed29103df044017e53eab0c8c8b4b1332933429f6356c5b4753" exitCode=0 Mar 18 14:14:30 crc kubenswrapper[4756]: I0318 14:14:30.395891 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" event={"ID":"81416b4c-681c-434a-9932-bbcc4ed16d11","Type":"ContainerDied","Data":"c6cb171adc1bfed29103df044017e53eab0c8c8b4b1332933429f6356c5b4753"} Mar 18 14:14:31 crc kubenswrapper[4756]: I0318 
14:14:31.773847 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" Mar 18 14:14:31 crc kubenswrapper[4756]: I0318 14:14:31.879595 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frfs6\" (UniqueName: \"kubernetes.io/projected/81416b4c-681c-434a-9932-bbcc4ed16d11-kube-api-access-frfs6\") pod \"81416b4c-681c-434a-9932-bbcc4ed16d11\" (UID: \"81416b4c-681c-434a-9932-bbcc4ed16d11\") " Mar 18 14:14:31 crc kubenswrapper[4756]: I0318 14:14:31.879669 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81416b4c-681c-434a-9932-bbcc4ed16d11-bundle\") pod \"81416b4c-681c-434a-9932-bbcc4ed16d11\" (UID: \"81416b4c-681c-434a-9932-bbcc4ed16d11\") " Mar 18 14:14:31 crc kubenswrapper[4756]: I0318 14:14:31.879726 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81416b4c-681c-434a-9932-bbcc4ed16d11-util\") pod \"81416b4c-681c-434a-9932-bbcc4ed16d11\" (UID: \"81416b4c-681c-434a-9932-bbcc4ed16d11\") " Mar 18 14:14:31 crc kubenswrapper[4756]: I0318 14:14:31.880978 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81416b4c-681c-434a-9932-bbcc4ed16d11-bundle" (OuterVolumeSpecName: "bundle") pod "81416b4c-681c-434a-9932-bbcc4ed16d11" (UID: "81416b4c-681c-434a-9932-bbcc4ed16d11"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:14:31 crc kubenswrapper[4756]: I0318 14:14:31.887387 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81416b4c-681c-434a-9932-bbcc4ed16d11-kube-api-access-frfs6" (OuterVolumeSpecName: "kube-api-access-frfs6") pod "81416b4c-681c-434a-9932-bbcc4ed16d11" (UID: "81416b4c-681c-434a-9932-bbcc4ed16d11"). InnerVolumeSpecName "kube-api-access-frfs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:14:31 crc kubenswrapper[4756]: I0318 14:14:31.892870 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81416b4c-681c-434a-9932-bbcc4ed16d11-util" (OuterVolumeSpecName: "util") pod "81416b4c-681c-434a-9932-bbcc4ed16d11" (UID: "81416b4c-681c-434a-9932-bbcc4ed16d11"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:14:31 crc kubenswrapper[4756]: I0318 14:14:31.981027 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frfs6\" (UniqueName: \"kubernetes.io/projected/81416b4c-681c-434a-9932-bbcc4ed16d11-kube-api-access-frfs6\") on node \"crc\" DevicePath \"\"" Mar 18 14:14:31 crc kubenswrapper[4756]: I0318 14:14:31.981299 4756 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81416b4c-681c-434a-9932-bbcc4ed16d11-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:14:31 crc kubenswrapper[4756]: I0318 14:14:31.981309 4756 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81416b4c-681c-434a-9932-bbcc4ed16d11-util\") on node \"crc\" DevicePath \"\"" Mar 18 14:14:32 crc kubenswrapper[4756]: I0318 14:14:32.420209 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" Mar 18 14:14:32 crc kubenswrapper[4756]: I0318 14:14:32.420203 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d" event={"ID":"81416b4c-681c-434a-9932-bbcc4ed16d11","Type":"ContainerDied","Data":"298ce9a648e5c126d60b0d73937c633ffc86771d113c5a7aa54aacc53e9029b3"} Mar 18 14:14:32 crc kubenswrapper[4756]: I0318 14:14:32.420404 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="298ce9a648e5c126d60b0d73937c633ffc86771d113c5a7aa54aacc53e9029b3" Mar 18 14:14:32 crc kubenswrapper[4756]: I0318 14:14:32.422970 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfdz4" event={"ID":"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef","Type":"ContainerStarted","Data":"31d2da239da146c06d8baaec2dd6b66d328068fb50bb91662d9ab780e57055bc"} Mar 18 14:14:32 crc kubenswrapper[4756]: I0318 14:14:32.426165 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"586afc48-b008-4373-b2e9-a33285cbc666","Type":"ContainerStarted","Data":"c1eb6eb4d41bff751047e123643b9d732c59d7ff3ec252e162d0138cd927f3a5"} Mar 18 14:14:32 crc kubenswrapper[4756]: I0318 14:14:32.473377 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.392710125 podStartE2EDuration="7.473347552s" podCreationTimestamp="2026-03-18 14:14:25 +0000 UTC" firstStartedPulling="2026-03-18 14:14:29.02265317 +0000 UTC m=+870.337071145" lastFinishedPulling="2026-03-18 14:14:32.103290607 +0000 UTC m=+873.417708572" observedRunningTime="2026-03-18 14:14:32.466387719 +0000 UTC m=+873.780805734" watchObservedRunningTime="2026-03-18 14:14:32.473347552 +0000 UTC m=+873.787765567" Mar 18 14:14:33 crc kubenswrapper[4756]: I0318 14:14:33.433907 4756 generic.go:334] "Generic 
(PLEG): container finished" podID="f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" containerID="31d2da239da146c06d8baaec2dd6b66d328068fb50bb91662d9ab780e57055bc" exitCode=0 Mar 18 14:14:33 crc kubenswrapper[4756]: I0318 14:14:33.433967 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfdz4" event={"ID":"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef","Type":"ContainerDied","Data":"31d2da239da146c06d8baaec2dd6b66d328068fb50bb91662d9ab780e57055bc"} Mar 18 14:14:34 crc kubenswrapper[4756]: I0318 14:14:34.443012 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfdz4" event={"ID":"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef","Type":"ContainerStarted","Data":"cd77cd6a66eef61deea97c9af620d500ccde4cd5ad2d19782be2fd7cd20fe7c0"} Mar 18 14:14:34 crc kubenswrapper[4756]: I0318 14:14:34.461764 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zfdz4" podStartSLOduration=2.995632229 podStartE2EDuration="7.46174842s" podCreationTimestamp="2026-03-18 14:14:27 +0000 UTC" firstStartedPulling="2026-03-18 14:14:29.373573316 +0000 UTC m=+870.687991291" lastFinishedPulling="2026-03-18 14:14:33.839689507 +0000 UTC m=+875.154107482" observedRunningTime="2026-03-18 14:14:34.460090585 +0000 UTC m=+875.774508570" watchObservedRunningTime="2026-03-18 14:14:34.46174842 +0000 UTC m=+875.776166395" Mar 18 14:14:36 crc kubenswrapper[4756]: I0318 14:14:36.914791 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:14:36 crc kubenswrapper[4756]: I0318 14:14:36.915109 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" 
podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.804137 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml"] Mar 18 14:14:37 crc kubenswrapper[4756]: E0318 14:14:37.811772 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81416b4c-681c-434a-9932-bbcc4ed16d11" containerName="pull" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.811817 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="81416b4c-681c-434a-9932-bbcc4ed16d11" containerName="pull" Mar 18 14:14:37 crc kubenswrapper[4756]: E0318 14:14:37.811848 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81416b4c-681c-434a-9932-bbcc4ed16d11" containerName="util" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.811864 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="81416b4c-681c-434a-9932-bbcc4ed16d11" containerName="util" Mar 18 14:14:37 crc kubenswrapper[4756]: E0318 14:14:37.811877 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81416b4c-681c-434a-9932-bbcc4ed16d11" containerName="extract" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.811885 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="81416b4c-681c-434a-9932-bbcc4ed16d11" containerName="extract" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.812207 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="81416b4c-681c-434a-9932-bbcc4ed16d11" containerName="extract" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.813789 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.818651 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-j48qj" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.819002 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.819167 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.819242 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.819331 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.819474 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.838935 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml"] Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.949946 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/84e944f2-90e7-4c7a-802d-703a8ef82200-manager-config\") pod \"loki-operator-controller-manager-7d4b6cd968-2lpml\" (UID: \"84e944f2-90e7-4c7a-802d-703a8ef82200\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 
14:14:37.949986 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84e944f2-90e7-4c7a-802d-703a8ef82200-apiservice-cert\") pod \"loki-operator-controller-manager-7d4b6cd968-2lpml\" (UID: \"84e944f2-90e7-4c7a-802d-703a8ef82200\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.950012 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84e944f2-90e7-4c7a-802d-703a8ef82200-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7d4b6cd968-2lpml\" (UID: \"84e944f2-90e7-4c7a-802d-703a8ef82200\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.950193 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfr29\" (UniqueName: \"kubernetes.io/projected/84e944f2-90e7-4c7a-802d-703a8ef82200-kube-api-access-gfr29\") pod \"loki-operator-controller-manager-7d4b6cd968-2lpml\" (UID: \"84e944f2-90e7-4c7a-802d-703a8ef82200\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:37 crc kubenswrapper[4756]: I0318 14:14:37.950394 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84e944f2-90e7-4c7a-802d-703a8ef82200-webhook-cert\") pod \"loki-operator-controller-manager-7d4b6cd968-2lpml\" (UID: \"84e944f2-90e7-4c7a-802d-703a8ef82200\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:38 crc kubenswrapper[4756]: I0318 14:14:38.051681 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"manager-config\" (UniqueName: \"kubernetes.io/configmap/84e944f2-90e7-4c7a-802d-703a8ef82200-manager-config\") pod \"loki-operator-controller-manager-7d4b6cd968-2lpml\" (UID: \"84e944f2-90e7-4c7a-802d-703a8ef82200\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:38 crc kubenswrapper[4756]: I0318 14:14:38.051729 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84e944f2-90e7-4c7a-802d-703a8ef82200-apiservice-cert\") pod \"loki-operator-controller-manager-7d4b6cd968-2lpml\" (UID: \"84e944f2-90e7-4c7a-802d-703a8ef82200\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:38 crc kubenswrapper[4756]: I0318 14:14:38.051753 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84e944f2-90e7-4c7a-802d-703a8ef82200-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7d4b6cd968-2lpml\" (UID: \"84e944f2-90e7-4c7a-802d-703a8ef82200\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:38 crc kubenswrapper[4756]: I0318 14:14:38.051779 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfr29\" (UniqueName: \"kubernetes.io/projected/84e944f2-90e7-4c7a-802d-703a8ef82200-kube-api-access-gfr29\") pod \"loki-operator-controller-manager-7d4b6cd968-2lpml\" (UID: \"84e944f2-90e7-4c7a-802d-703a8ef82200\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:38 crc kubenswrapper[4756]: I0318 14:14:38.051810 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84e944f2-90e7-4c7a-802d-703a8ef82200-webhook-cert\") pod \"loki-operator-controller-manager-7d4b6cd968-2lpml\" (UID: 
\"84e944f2-90e7-4c7a-802d-703a8ef82200\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:38 crc kubenswrapper[4756]: I0318 14:14:38.052973 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/84e944f2-90e7-4c7a-802d-703a8ef82200-manager-config\") pod \"loki-operator-controller-manager-7d4b6cd968-2lpml\" (UID: \"84e944f2-90e7-4c7a-802d-703a8ef82200\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:38 crc kubenswrapper[4756]: I0318 14:14:38.058337 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:38 crc kubenswrapper[4756]: I0318 14:14:38.058386 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:38 crc kubenswrapper[4756]: I0318 14:14:38.059845 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84e944f2-90e7-4c7a-802d-703a8ef82200-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7d4b6cd968-2lpml\" (UID: \"84e944f2-90e7-4c7a-802d-703a8ef82200\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:38 crc kubenswrapper[4756]: I0318 14:14:38.063647 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84e944f2-90e7-4c7a-802d-703a8ef82200-apiservice-cert\") pod \"loki-operator-controller-manager-7d4b6cd968-2lpml\" (UID: \"84e944f2-90e7-4c7a-802d-703a8ef82200\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:38 crc kubenswrapper[4756]: I0318 14:14:38.080679 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/84e944f2-90e7-4c7a-802d-703a8ef82200-webhook-cert\") pod \"loki-operator-controller-manager-7d4b6cd968-2lpml\" (UID: \"84e944f2-90e7-4c7a-802d-703a8ef82200\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:38 crc kubenswrapper[4756]: I0318 14:14:38.087875 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfr29\" (UniqueName: \"kubernetes.io/projected/84e944f2-90e7-4c7a-802d-703a8ef82200-kube-api-access-gfr29\") pod \"loki-operator-controller-manager-7d4b6cd968-2lpml\" (UID: \"84e944f2-90e7-4c7a-802d-703a8ef82200\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:38 crc kubenswrapper[4756]: I0318 14:14:38.131152 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:38 crc kubenswrapper[4756]: I0318 14:14:38.551346 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml"] Mar 18 14:14:38 crc kubenswrapper[4756]: W0318 14:14:38.555381 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84e944f2_90e7_4c7a_802d_703a8ef82200.slice/crio-ac9cbf837165ff631a0f6df2fb9fda4734da26bff3e5d164b2e26369901bd8ed WatchSource:0}: Error finding container ac9cbf837165ff631a0f6df2fb9fda4734da26bff3e5d164b2e26369901bd8ed: Status 404 returned error can't find the container with id ac9cbf837165ff631a0f6df2fb9fda4734da26bff3e5d164b2e26369901bd8ed Mar 18 14:14:39 crc kubenswrapper[4756]: I0318 14:14:39.124953 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zfdz4" podUID="f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" containerName="registry-server" probeResult="failure" output=< Mar 18 14:14:39 crc 
kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Mar 18 14:14:39 crc kubenswrapper[4756]: > Mar 18 14:14:39 crc kubenswrapper[4756]: I0318 14:14:39.471513 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" event={"ID":"84e944f2-90e7-4c7a-802d-703a8ef82200","Type":"ContainerStarted","Data":"ac9cbf837165ff631a0f6df2fb9fda4734da26bff3e5d164b2e26369901bd8ed"} Mar 18 14:14:43 crc kubenswrapper[4756]: I0318 14:14:43.496085 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" event={"ID":"84e944f2-90e7-4c7a-802d-703a8ef82200","Type":"ContainerStarted","Data":"a8fee5caee5ed55b5a9d6572f3e26a8c70a60ed7800ccf6c7ad5ab194f997095"} Mar 18 14:14:48 crc kubenswrapper[4756]: I0318 14:14:48.107215 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:48 crc kubenswrapper[4756]: I0318 14:14:48.148763 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:50 crc kubenswrapper[4756]: I0318 14:14:50.533225 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zfdz4"] Mar 18 14:14:50 crc kubenswrapper[4756]: I0318 14:14:50.533804 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zfdz4" podUID="f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" containerName="registry-server" containerID="cri-o://cd77cd6a66eef61deea97c9af620d500ccde4cd5ad2d19782be2fd7cd20fe7c0" gracePeriod=2 Mar 18 14:14:50 crc kubenswrapper[4756]: I0318 14:14:50.562752 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" 
event={"ID":"84e944f2-90e7-4c7a-802d-703a8ef82200","Type":"ContainerStarted","Data":"243bd5ce470494745161732c55887273f6a9ab74fa20a276879b30dedeb830d6"} Mar 18 14:14:50 crc kubenswrapper[4756]: I0318 14:14:50.563556 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:50 crc kubenswrapper[4756]: I0318 14:14:50.566989 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" Mar 18 14:14:50 crc kubenswrapper[4756]: I0318 14:14:50.592197 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-7d4b6cd968-2lpml" podStartSLOduration=1.872585197 podStartE2EDuration="13.592112669s" podCreationTimestamp="2026-03-18 14:14:37 +0000 UTC" firstStartedPulling="2026-03-18 14:14:38.557790867 +0000 UTC m=+879.872208842" lastFinishedPulling="2026-03-18 14:14:50.277318329 +0000 UTC m=+891.591736314" observedRunningTime="2026-03-18 14:14:50.58850597 +0000 UTC m=+891.902923985" watchObservedRunningTime="2026-03-18 14:14:50.592112669 +0000 UTC m=+891.906530674" Mar 18 14:14:50 crc kubenswrapper[4756]: I0318 14:14:50.947651 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.048558 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-catalog-content\") pod \"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef\" (UID: \"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef\") " Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.048660 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-utilities\") pod \"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef\" (UID: \"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef\") " Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.048765 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtnqg\" (UniqueName: \"kubernetes.io/projected/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-kube-api-access-rtnqg\") pod \"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef\" (UID: \"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef\") " Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.049983 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-utilities" (OuterVolumeSpecName: "utilities") pod "f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" (UID: "f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.053751 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-kube-api-access-rtnqg" (OuterVolumeSpecName: "kube-api-access-rtnqg") pod "f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" (UID: "f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef"). InnerVolumeSpecName "kube-api-access-rtnqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.150390 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.150426 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtnqg\" (UniqueName: \"kubernetes.io/projected/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-kube-api-access-rtnqg\") on node \"crc\" DevicePath \"\"" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.198041 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" (UID: "f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.251334 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.572329 4756 generic.go:334] "Generic (PLEG): container finished" podID="f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" containerID="cd77cd6a66eef61deea97c9af620d500ccde4cd5ad2d19782be2fd7cd20fe7c0" exitCode=0 Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.572521 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zfdz4" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.572594 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfdz4" event={"ID":"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef","Type":"ContainerDied","Data":"cd77cd6a66eef61deea97c9af620d500ccde4cd5ad2d19782be2fd7cd20fe7c0"} Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.572678 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfdz4" event={"ID":"f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef","Type":"ContainerDied","Data":"bda2db6e1e31125535d8bcae2beccd891c45607eda5abac1dd9726dab6e95ec2"} Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.572737 4756 scope.go:117] "RemoveContainer" containerID="cd77cd6a66eef61deea97c9af620d500ccde4cd5ad2d19782be2fd7cd20fe7c0" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.604990 4756 scope.go:117] "RemoveContainer" containerID="31d2da239da146c06d8baaec2dd6b66d328068fb50bb91662d9ab780e57055bc" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.608596 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zfdz4"] Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.614036 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zfdz4"] Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.628161 4756 scope.go:117] "RemoveContainer" containerID="bf25a50e9bba3ca869feb637c71488856b249f583edae442d6dafe6ec215c800" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.664882 4756 scope.go:117] "RemoveContainer" containerID="cd77cd6a66eef61deea97c9af620d500ccde4cd5ad2d19782be2fd7cd20fe7c0" Mar 18 14:14:51 crc kubenswrapper[4756]: E0318 14:14:51.665646 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cd77cd6a66eef61deea97c9af620d500ccde4cd5ad2d19782be2fd7cd20fe7c0\": container with ID starting with cd77cd6a66eef61deea97c9af620d500ccde4cd5ad2d19782be2fd7cd20fe7c0 not found: ID does not exist" containerID="cd77cd6a66eef61deea97c9af620d500ccde4cd5ad2d19782be2fd7cd20fe7c0" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.665690 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd77cd6a66eef61deea97c9af620d500ccde4cd5ad2d19782be2fd7cd20fe7c0"} err="failed to get container status \"cd77cd6a66eef61deea97c9af620d500ccde4cd5ad2d19782be2fd7cd20fe7c0\": rpc error: code = NotFound desc = could not find container \"cd77cd6a66eef61deea97c9af620d500ccde4cd5ad2d19782be2fd7cd20fe7c0\": container with ID starting with cd77cd6a66eef61deea97c9af620d500ccde4cd5ad2d19782be2fd7cd20fe7c0 not found: ID does not exist" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.665715 4756 scope.go:117] "RemoveContainer" containerID="31d2da239da146c06d8baaec2dd6b66d328068fb50bb91662d9ab780e57055bc" Mar 18 14:14:51 crc kubenswrapper[4756]: E0318 14:14:51.666166 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31d2da239da146c06d8baaec2dd6b66d328068fb50bb91662d9ab780e57055bc\": container with ID starting with 31d2da239da146c06d8baaec2dd6b66d328068fb50bb91662d9ab780e57055bc not found: ID does not exist" containerID="31d2da239da146c06d8baaec2dd6b66d328068fb50bb91662d9ab780e57055bc" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.666212 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31d2da239da146c06d8baaec2dd6b66d328068fb50bb91662d9ab780e57055bc"} err="failed to get container status \"31d2da239da146c06d8baaec2dd6b66d328068fb50bb91662d9ab780e57055bc\": rpc error: code = NotFound desc = could not find container \"31d2da239da146c06d8baaec2dd6b66d328068fb50bb91662d9ab780e57055bc\": container with ID 
starting with 31d2da239da146c06d8baaec2dd6b66d328068fb50bb91662d9ab780e57055bc not found: ID does not exist" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.666238 4756 scope.go:117] "RemoveContainer" containerID="bf25a50e9bba3ca869feb637c71488856b249f583edae442d6dafe6ec215c800" Mar 18 14:14:51 crc kubenswrapper[4756]: E0318 14:14:51.666550 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf25a50e9bba3ca869feb637c71488856b249f583edae442d6dafe6ec215c800\": container with ID starting with bf25a50e9bba3ca869feb637c71488856b249f583edae442d6dafe6ec215c800 not found: ID does not exist" containerID="bf25a50e9bba3ca869feb637c71488856b249f583edae442d6dafe6ec215c800" Mar 18 14:14:51 crc kubenswrapper[4756]: I0318 14:14:51.666584 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf25a50e9bba3ca869feb637c71488856b249f583edae442d6dafe6ec215c800"} err="failed to get container status \"bf25a50e9bba3ca869feb637c71488856b249f583edae442d6dafe6ec215c800\": rpc error: code = NotFound desc = could not find container \"bf25a50e9bba3ca869feb637c71488856b249f583edae442d6dafe6ec215c800\": container with ID starting with bf25a50e9bba3ca869feb637c71488856b249f583edae442d6dafe6ec215c800 not found: ID does not exist" Mar 18 14:14:53 crc kubenswrapper[4756]: I0318 14:14:53.329650 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" path="/var/lib/kubelet/pods/f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef/volumes" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.143114 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s"] Mar 18 14:15:00 crc kubenswrapper[4756]: E0318 14:15:00.143737 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" containerName="extract-utilities" Mar 
18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.143755 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" containerName="extract-utilities" Mar 18 14:15:00 crc kubenswrapper[4756]: E0318 14:15:00.143777 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" containerName="registry-server" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.143788 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" containerName="registry-server" Mar 18 14:15:00 crc kubenswrapper[4756]: E0318 14:15:00.143801 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" containerName="extract-content" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.143814 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" containerName="extract-content" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.143980 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0166b9f-2fc7-4d1c-9b81-8267ddbaaeef" containerName="registry-server" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.144636 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.147460 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.156464 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.156885 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s"] Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.278500 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34dec4ac-ef61-4769-a369-e0f463c78467-config-volume\") pod \"collect-profiles-29564055-bkp2s\" (UID: \"34dec4ac-ef61-4769-a369-e0f463c78467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.279165 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th9q5\" (UniqueName: \"kubernetes.io/projected/34dec4ac-ef61-4769-a369-e0f463c78467-kube-api-access-th9q5\") pod \"collect-profiles-29564055-bkp2s\" (UID: \"34dec4ac-ef61-4769-a369-e0f463c78467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.279351 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34dec4ac-ef61-4769-a369-e0f463c78467-secret-volume\") pod \"collect-profiles-29564055-bkp2s\" (UID: \"34dec4ac-ef61-4769-a369-e0f463c78467\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.381147 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th9q5\" (UniqueName: \"kubernetes.io/projected/34dec4ac-ef61-4769-a369-e0f463c78467-kube-api-access-th9q5\") pod \"collect-profiles-29564055-bkp2s\" (UID: \"34dec4ac-ef61-4769-a369-e0f463c78467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.381242 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34dec4ac-ef61-4769-a369-e0f463c78467-secret-volume\") pod \"collect-profiles-29564055-bkp2s\" (UID: \"34dec4ac-ef61-4769-a369-e0f463c78467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.381362 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34dec4ac-ef61-4769-a369-e0f463c78467-config-volume\") pod \"collect-profiles-29564055-bkp2s\" (UID: \"34dec4ac-ef61-4769-a369-e0f463c78467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.382545 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34dec4ac-ef61-4769-a369-e0f463c78467-config-volume\") pod \"collect-profiles-29564055-bkp2s\" (UID: \"34dec4ac-ef61-4769-a369-e0f463c78467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.389253 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/34dec4ac-ef61-4769-a369-e0f463c78467-secret-volume\") pod \"collect-profiles-29564055-bkp2s\" (UID: \"34dec4ac-ef61-4769-a369-e0f463c78467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.403922 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th9q5\" (UniqueName: \"kubernetes.io/projected/34dec4ac-ef61-4769-a369-e0f463c78467-kube-api-access-th9q5\") pod \"collect-profiles-29564055-bkp2s\" (UID: \"34dec4ac-ef61-4769-a369-e0f463c78467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.468101 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" Mar 18 14:15:00 crc kubenswrapper[4756]: I0318 14:15:00.714154 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s"] Mar 18 14:15:01 crc kubenswrapper[4756]: I0318 14:15:01.657069 4756 generic.go:334] "Generic (PLEG): container finished" podID="34dec4ac-ef61-4769-a369-e0f463c78467" containerID="73c64e3b6232d7aaa06d468af86b569fa9f44e9ab820edcefac196f11f754c5d" exitCode=0 Mar 18 14:15:01 crc kubenswrapper[4756]: I0318 14:15:01.657224 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" event={"ID":"34dec4ac-ef61-4769-a369-e0f463c78467","Type":"ContainerDied","Data":"73c64e3b6232d7aaa06d468af86b569fa9f44e9ab820edcefac196f11f754c5d"} Mar 18 14:15:01 crc kubenswrapper[4756]: I0318 14:15:01.657632 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" 
event={"ID":"34dec4ac-ef61-4769-a369-e0f463c78467","Type":"ContainerStarted","Data":"03cf9ab3442e1fa6833dff0501ba10b33f0338f67cae206f73cebdd854ed0c4a"} Mar 18 14:15:02 crc kubenswrapper[4756]: I0318 14:15:02.961857 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" Mar 18 14:15:03 crc kubenswrapper[4756]: I0318 14:15:03.116070 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34dec4ac-ef61-4769-a369-e0f463c78467-secret-volume\") pod \"34dec4ac-ef61-4769-a369-e0f463c78467\" (UID: \"34dec4ac-ef61-4769-a369-e0f463c78467\") " Mar 18 14:15:03 crc kubenswrapper[4756]: I0318 14:15:03.116179 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34dec4ac-ef61-4769-a369-e0f463c78467-config-volume\") pod \"34dec4ac-ef61-4769-a369-e0f463c78467\" (UID: \"34dec4ac-ef61-4769-a369-e0f463c78467\") " Mar 18 14:15:03 crc kubenswrapper[4756]: I0318 14:15:03.116220 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th9q5\" (UniqueName: \"kubernetes.io/projected/34dec4ac-ef61-4769-a369-e0f463c78467-kube-api-access-th9q5\") pod \"34dec4ac-ef61-4769-a369-e0f463c78467\" (UID: \"34dec4ac-ef61-4769-a369-e0f463c78467\") " Mar 18 14:15:03 crc kubenswrapper[4756]: I0318 14:15:03.116723 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34dec4ac-ef61-4769-a369-e0f463c78467-config-volume" (OuterVolumeSpecName: "config-volume") pod "34dec4ac-ef61-4769-a369-e0f463c78467" (UID: "34dec4ac-ef61-4769-a369-e0f463c78467"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:15:03 crc kubenswrapper[4756]: I0318 14:15:03.122696 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34dec4ac-ef61-4769-a369-e0f463c78467-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "34dec4ac-ef61-4769-a369-e0f463c78467" (UID: "34dec4ac-ef61-4769-a369-e0f463c78467"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:15:03 crc kubenswrapper[4756]: I0318 14:15:03.123013 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34dec4ac-ef61-4769-a369-e0f463c78467-kube-api-access-th9q5" (OuterVolumeSpecName: "kube-api-access-th9q5") pod "34dec4ac-ef61-4769-a369-e0f463c78467" (UID: "34dec4ac-ef61-4769-a369-e0f463c78467"). InnerVolumeSpecName "kube-api-access-th9q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:15:03 crc kubenswrapper[4756]: I0318 14:15:03.217587 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34dec4ac-ef61-4769-a369-e0f463c78467-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:03 crc kubenswrapper[4756]: I0318 14:15:03.217636 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34dec4ac-ef61-4769-a369-e0f463c78467-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:03 crc kubenswrapper[4756]: I0318 14:15:03.217657 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th9q5\" (UniqueName: \"kubernetes.io/projected/34dec4ac-ef61-4769-a369-e0f463c78467-kube-api-access-th9q5\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:03 crc kubenswrapper[4756]: I0318 14:15:03.676392 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" 
event={"ID":"34dec4ac-ef61-4769-a369-e0f463c78467","Type":"ContainerDied","Data":"03cf9ab3442e1fa6833dff0501ba10b33f0338f67cae206f73cebdd854ed0c4a"} Mar 18 14:15:03 crc kubenswrapper[4756]: I0318 14:15:03.676453 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03cf9ab3442e1fa6833dff0501ba10b33f0338f67cae206f73cebdd854ed0c4a" Mar 18 14:15:03 crc kubenswrapper[4756]: I0318 14:15:03.676510 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s" Mar 18 14:15:06 crc kubenswrapper[4756]: I0318 14:15:06.915026 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:15:06 crc kubenswrapper[4756]: I0318 14:15:06.915537 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.395449 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r"] Mar 18 14:15:22 crc kubenswrapper[4756]: E0318 14:15:22.396046 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34dec4ac-ef61-4769-a369-e0f463c78467" containerName="collect-profiles" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.396071 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="34dec4ac-ef61-4769-a369-e0f463c78467" containerName="collect-profiles" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 
14:15:22.396205 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="34dec4ac-ef61-4769-a369-e0f463c78467" containerName="collect-profiles" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.397080 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.398982 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.412764 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r"] Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.493704 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0198b01-80fd-4196-b81c-fbe69a187c18-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r\" (UID: \"b0198b01-80fd-4196-b81c-fbe69a187c18\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.493774 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42mdj\" (UniqueName: \"kubernetes.io/projected/b0198b01-80fd-4196-b81c-fbe69a187c18-kube-api-access-42mdj\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r\" (UID: \"b0198b01-80fd-4196-b81c-fbe69a187c18\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.493808 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/b0198b01-80fd-4196-b81c-fbe69a187c18-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r\" (UID: \"b0198b01-80fd-4196-b81c-fbe69a187c18\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.595181 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0198b01-80fd-4196-b81c-fbe69a187c18-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r\" (UID: \"b0198b01-80fd-4196-b81c-fbe69a187c18\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.595220 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42mdj\" (UniqueName: \"kubernetes.io/projected/b0198b01-80fd-4196-b81c-fbe69a187c18-kube-api-access-42mdj\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r\" (UID: \"b0198b01-80fd-4196-b81c-fbe69a187c18\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.595252 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0198b01-80fd-4196-b81c-fbe69a187c18-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r\" (UID: \"b0198b01-80fd-4196-b81c-fbe69a187c18\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.595733 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0198b01-80fd-4196-b81c-fbe69a187c18-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r\" (UID: 
\"b0198b01-80fd-4196-b81c-fbe69a187c18\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.595747 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0198b01-80fd-4196-b81c-fbe69a187c18-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r\" (UID: \"b0198b01-80fd-4196-b81c-fbe69a187c18\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.623293 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42mdj\" (UniqueName: \"kubernetes.io/projected/b0198b01-80fd-4196-b81c-fbe69a187c18-kube-api-access-42mdj\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r\" (UID: \"b0198b01-80fd-4196-b81c-fbe69a187c18\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.718218 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" Mar 18 14:15:22 crc kubenswrapper[4756]: I0318 14:15:22.938015 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r"] Mar 18 14:15:23 crc kubenswrapper[4756]: I0318 14:15:23.879855 4756 generic.go:334] "Generic (PLEG): container finished" podID="b0198b01-80fd-4196-b81c-fbe69a187c18" containerID="bc636ba13ddb34c8fe4ef539ebaffe99c34f950c0d9b3b8e7d69c263700a62dc" exitCode=0 Mar 18 14:15:23 crc kubenswrapper[4756]: I0318 14:15:23.879898 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" event={"ID":"b0198b01-80fd-4196-b81c-fbe69a187c18","Type":"ContainerDied","Data":"bc636ba13ddb34c8fe4ef539ebaffe99c34f950c0d9b3b8e7d69c263700a62dc"} Mar 18 14:15:23 crc kubenswrapper[4756]: I0318 14:15:23.879921 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" event={"ID":"b0198b01-80fd-4196-b81c-fbe69a187c18","Type":"ContainerStarted","Data":"519ba709c756aef2c1df6b679ca89058abe1b2416034f010d2292599b933ec22"} Mar 18 14:15:23 crc kubenswrapper[4756]: I0318 14:15:23.881583 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:15:25 crc kubenswrapper[4756]: I0318 14:15:25.898936 4756 generic.go:334] "Generic (PLEG): container finished" podID="b0198b01-80fd-4196-b81c-fbe69a187c18" containerID="b290636ac0f358d35f88ca9be7003a579971225040218e11d38c0df5b39095e4" exitCode=0 Mar 18 14:15:25 crc kubenswrapper[4756]: I0318 14:15:25.899060 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" 
event={"ID":"b0198b01-80fd-4196-b81c-fbe69a187c18","Type":"ContainerDied","Data":"b290636ac0f358d35f88ca9be7003a579971225040218e11d38c0df5b39095e4"} Mar 18 14:15:26 crc kubenswrapper[4756]: I0318 14:15:26.909721 4756 generic.go:334] "Generic (PLEG): container finished" podID="b0198b01-80fd-4196-b81c-fbe69a187c18" containerID="7179c6f1b318cd8fdb4d7fa1c61e17b3e3d61ff8e7316b791aeb9916e5be5b09" exitCode=0 Mar 18 14:15:26 crc kubenswrapper[4756]: I0318 14:15:26.909817 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" event={"ID":"b0198b01-80fd-4196-b81c-fbe69a187c18","Type":"ContainerDied","Data":"7179c6f1b318cd8fdb4d7fa1c61e17b3e3d61ff8e7316b791aeb9916e5be5b09"} Mar 18 14:15:28 crc kubenswrapper[4756]: I0318 14:15:28.177298 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" Mar 18 14:15:28 crc kubenswrapper[4756]: I0318 14:15:28.273874 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0198b01-80fd-4196-b81c-fbe69a187c18-util\") pod \"b0198b01-80fd-4196-b81c-fbe69a187c18\" (UID: \"b0198b01-80fd-4196-b81c-fbe69a187c18\") " Mar 18 14:15:28 crc kubenswrapper[4756]: I0318 14:15:28.273946 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0198b01-80fd-4196-b81c-fbe69a187c18-bundle\") pod \"b0198b01-80fd-4196-b81c-fbe69a187c18\" (UID: \"b0198b01-80fd-4196-b81c-fbe69a187c18\") " Mar 18 14:15:28 crc kubenswrapper[4756]: I0318 14:15:28.274048 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42mdj\" (UniqueName: \"kubernetes.io/projected/b0198b01-80fd-4196-b81c-fbe69a187c18-kube-api-access-42mdj\") pod \"b0198b01-80fd-4196-b81c-fbe69a187c18\" (UID: 
\"b0198b01-80fd-4196-b81c-fbe69a187c18\") " Mar 18 14:15:28 crc kubenswrapper[4756]: I0318 14:15:28.274501 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0198b01-80fd-4196-b81c-fbe69a187c18-bundle" (OuterVolumeSpecName: "bundle") pod "b0198b01-80fd-4196-b81c-fbe69a187c18" (UID: "b0198b01-80fd-4196-b81c-fbe69a187c18"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:15:28 crc kubenswrapper[4756]: I0318 14:15:28.279441 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0198b01-80fd-4196-b81c-fbe69a187c18-kube-api-access-42mdj" (OuterVolumeSpecName: "kube-api-access-42mdj") pod "b0198b01-80fd-4196-b81c-fbe69a187c18" (UID: "b0198b01-80fd-4196-b81c-fbe69a187c18"). InnerVolumeSpecName "kube-api-access-42mdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:15:28 crc kubenswrapper[4756]: I0318 14:15:28.376017 4756 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0198b01-80fd-4196-b81c-fbe69a187c18-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:28 crc kubenswrapper[4756]: I0318 14:15:28.376064 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42mdj\" (UniqueName: \"kubernetes.io/projected/b0198b01-80fd-4196-b81c-fbe69a187c18-kube-api-access-42mdj\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:28 crc kubenswrapper[4756]: I0318 14:15:28.605365 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0198b01-80fd-4196-b81c-fbe69a187c18-util" (OuterVolumeSpecName: "util") pod "b0198b01-80fd-4196-b81c-fbe69a187c18" (UID: "b0198b01-80fd-4196-b81c-fbe69a187c18"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:15:28 crc kubenswrapper[4756]: I0318 14:15:28.680579 4756 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0198b01-80fd-4196-b81c-fbe69a187c18-util\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:28 crc kubenswrapper[4756]: I0318 14:15:28.925365 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" event={"ID":"b0198b01-80fd-4196-b81c-fbe69a187c18","Type":"ContainerDied","Data":"519ba709c756aef2c1df6b679ca89058abe1b2416034f010d2292599b933ec22"} Mar 18 14:15:28 crc kubenswrapper[4756]: I0318 14:15:28.925431 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="519ba709c756aef2c1df6b679ca89058abe1b2416034f010d2292599b933ec22" Mar 18 14:15:28 crc kubenswrapper[4756]: I0318 14:15:28.925519 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r" Mar 18 14:15:31 crc kubenswrapper[4756]: I0318 14:15:31.824880 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-bjjcv"] Mar 18 14:15:31 crc kubenswrapper[4756]: E0318 14:15:31.825680 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0198b01-80fd-4196-b81c-fbe69a187c18" containerName="util" Mar 18 14:15:31 crc kubenswrapper[4756]: I0318 14:15:31.825703 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0198b01-80fd-4196-b81c-fbe69a187c18" containerName="util" Mar 18 14:15:31 crc kubenswrapper[4756]: E0318 14:15:31.825728 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0198b01-80fd-4196-b81c-fbe69a187c18" containerName="extract" Mar 18 14:15:31 crc kubenswrapper[4756]: I0318 14:15:31.825742 4756 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b0198b01-80fd-4196-b81c-fbe69a187c18" containerName="extract" Mar 18 14:15:31 crc kubenswrapper[4756]: E0318 14:15:31.825755 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0198b01-80fd-4196-b81c-fbe69a187c18" containerName="pull" Mar 18 14:15:31 crc kubenswrapper[4756]: I0318 14:15:31.825767 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0198b01-80fd-4196-b81c-fbe69a187c18" containerName="pull" Mar 18 14:15:31 crc kubenswrapper[4756]: I0318 14:15:31.825971 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0198b01-80fd-4196-b81c-fbe69a187c18" containerName="extract" Mar 18 14:15:31 crc kubenswrapper[4756]: I0318 14:15:31.826672 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-bjjcv" Mar 18 14:15:31 crc kubenswrapper[4756]: I0318 14:15:31.828911 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 18 14:15:31 crc kubenswrapper[4756]: I0318 14:15:31.829243 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 18 14:15:31 crc kubenswrapper[4756]: I0318 14:15:31.829599 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-cv6cs" Mar 18 14:15:31 crc kubenswrapper[4756]: I0318 14:15:31.841688 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-bjjcv"] Mar 18 14:15:31 crc kubenswrapper[4756]: I0318 14:15:31.923675 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnwcx\" (UniqueName: \"kubernetes.io/projected/bc5a9dab-2f6a-456b-83de-eb45d03c4062-kube-api-access-xnwcx\") pod \"nmstate-operator-796d4cfff4-bjjcv\" (UID: \"bc5a9dab-2f6a-456b-83de-eb45d03c4062\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-bjjcv" Mar 18 
14:15:32 crc kubenswrapper[4756]: I0318 14:15:32.025249 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnwcx\" (UniqueName: \"kubernetes.io/projected/bc5a9dab-2f6a-456b-83de-eb45d03c4062-kube-api-access-xnwcx\") pod \"nmstate-operator-796d4cfff4-bjjcv\" (UID: \"bc5a9dab-2f6a-456b-83de-eb45d03c4062\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-bjjcv" Mar 18 14:15:32 crc kubenswrapper[4756]: I0318 14:15:32.047754 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnwcx\" (UniqueName: \"kubernetes.io/projected/bc5a9dab-2f6a-456b-83de-eb45d03c4062-kube-api-access-xnwcx\") pod \"nmstate-operator-796d4cfff4-bjjcv\" (UID: \"bc5a9dab-2f6a-456b-83de-eb45d03c4062\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-bjjcv" Mar 18 14:15:32 crc kubenswrapper[4756]: I0318 14:15:32.151624 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-bjjcv" Mar 18 14:15:32 crc kubenswrapper[4756]: I0318 14:15:32.377482 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-bjjcv"] Mar 18 14:15:32 crc kubenswrapper[4756]: I0318 14:15:32.952294 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-bjjcv" event={"ID":"bc5a9dab-2f6a-456b-83de-eb45d03c4062","Type":"ContainerStarted","Data":"394bc4ecfd5cea3be3f734a31b9ada2fc48b9a8937981d137b0d1256f055981b"} Mar 18 14:15:34 crc kubenswrapper[4756]: I0318 14:15:34.970303 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-bjjcv" event={"ID":"bc5a9dab-2f6a-456b-83de-eb45d03c4062","Type":"ContainerStarted","Data":"d9e182531539cefd387c00ab55a67904f0b0209808a9b7fb4c25fe0612311921"} Mar 18 14:15:34 crc kubenswrapper[4756]: I0318 14:15:34.991969 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-operator-796d4cfff4-bjjcv" podStartSLOduration=1.8911561780000001 podStartE2EDuration="3.991947455s" podCreationTimestamp="2026-03-18 14:15:31 +0000 UTC" firstStartedPulling="2026-03-18 14:15:32.380966569 +0000 UTC m=+933.695384544" lastFinishedPulling="2026-03-18 14:15:34.481757846 +0000 UTC m=+935.796175821" observedRunningTime="2026-03-18 14:15:34.990518455 +0000 UTC m=+936.304936430" watchObservedRunningTime="2026-03-18 14:15:34.991947455 +0000 UTC m=+936.306365440" Mar 18 14:15:35 crc kubenswrapper[4756]: I0318 14:15:35.974740 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-895gl"] Mar 18 14:15:35 crc kubenswrapper[4756]: I0318 14:15:35.975588 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-895gl" Mar 18 14:15:35 crc kubenswrapper[4756]: I0318 14:15:35.980463 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-qlz86" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.007434 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-895gl"] Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.008005 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-65v9t"] Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.009333 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-65v9t" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.011920 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.032895 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-qd4sc"] Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.033731 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.054186 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-65v9t"] Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.079889 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz7hl\" (UniqueName: \"kubernetes.io/projected/c81cea89-3315-4647-8596-e0132b8dd763-kube-api-access-cz7hl\") pod \"nmstate-metrics-9b8c8685d-895gl\" (UID: \"c81cea89-3315-4647-8596-e0132b8dd763\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-895gl" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.139174 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88"] Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.139855 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.142087 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.143292 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.143593 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-wfpvh" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.150521 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88"] Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.180503 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f175bb68-1110-4701-b4a1-9eb04330fdb2-nmstate-lock\") pod \"nmstate-handler-qd4sc\" (UID: \"f175bb68-1110-4701-b4a1-9eb04330fdb2\") " pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.180753 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/593a9109-9e37-47a7-b467-de7be3502ba4-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-65v9t\" (UID: \"593a9109-9e37-47a7-b467-de7be3502ba4\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-65v9t" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.180869 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f175bb68-1110-4701-b4a1-9eb04330fdb2-ovs-socket\") pod \"nmstate-handler-qd4sc\" (UID: \"f175bb68-1110-4701-b4a1-9eb04330fdb2\") " pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 
14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.180948 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz7hl\" (UniqueName: \"kubernetes.io/projected/c81cea89-3315-4647-8596-e0132b8dd763-kube-api-access-cz7hl\") pod \"nmstate-metrics-9b8c8685d-895gl\" (UID: \"c81cea89-3315-4647-8596-e0132b8dd763\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-895gl" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.181039 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr9bp\" (UniqueName: \"kubernetes.io/projected/593a9109-9e37-47a7-b467-de7be3502ba4-kube-api-access-mr9bp\") pod \"nmstate-webhook-5f558f5558-65v9t\" (UID: \"593a9109-9e37-47a7-b467-de7be3502ba4\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-65v9t" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.181136 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmqbf\" (UniqueName: \"kubernetes.io/projected/f175bb68-1110-4701-b4a1-9eb04330fdb2-kube-api-access-nmqbf\") pod \"nmstate-handler-qd4sc\" (UID: \"f175bb68-1110-4701-b4a1-9eb04330fdb2\") " pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.181234 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f175bb68-1110-4701-b4a1-9eb04330fdb2-dbus-socket\") pod \"nmstate-handler-qd4sc\" (UID: \"f175bb68-1110-4701-b4a1-9eb04330fdb2\") " pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.198957 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz7hl\" (UniqueName: \"kubernetes.io/projected/c81cea89-3315-4647-8596-e0132b8dd763-kube-api-access-cz7hl\") pod \"nmstate-metrics-9b8c8685d-895gl\" (UID: 
\"c81cea89-3315-4647-8596-e0132b8dd763\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-895gl" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.282561 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f175bb68-1110-4701-b4a1-9eb04330fdb2-nmstate-lock\") pod \"nmstate-handler-qd4sc\" (UID: \"f175bb68-1110-4701-b4a1-9eb04330fdb2\") " pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.282868 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/593a9109-9e37-47a7-b467-de7be3502ba4-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-65v9t\" (UID: \"593a9109-9e37-47a7-b467-de7be3502ba4\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-65v9t" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.282907 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f175bb68-1110-4701-b4a1-9eb04330fdb2-ovs-socket\") pod \"nmstate-handler-qd4sc\" (UID: \"f175bb68-1110-4701-b4a1-9eb04330fdb2\") " pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.282654 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f175bb68-1110-4701-b4a1-9eb04330fdb2-nmstate-lock\") pod \"nmstate-handler-qd4sc\" (UID: \"f175bb68-1110-4701-b4a1-9eb04330fdb2\") " pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.282928 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2bj9\" (UniqueName: \"kubernetes.io/projected/a4975a7f-ddbb-46e3-91be-b1a7757abced-kube-api-access-g2bj9\") pod \"nmstate-console-plugin-86f58fcf4-5gg88\" (UID: 
\"a4975a7f-ddbb-46e3-91be-b1a7757abced\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.283018 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f175bb68-1110-4701-b4a1-9eb04330fdb2-ovs-socket\") pod \"nmstate-handler-qd4sc\" (UID: \"f175bb68-1110-4701-b4a1-9eb04330fdb2\") " pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.283083 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr9bp\" (UniqueName: \"kubernetes.io/projected/593a9109-9e37-47a7-b467-de7be3502ba4-kube-api-access-mr9bp\") pod \"nmstate-webhook-5f558f5558-65v9t\" (UID: \"593a9109-9e37-47a7-b467-de7be3502ba4\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-65v9t" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.283104 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmqbf\" (UniqueName: \"kubernetes.io/projected/f175bb68-1110-4701-b4a1-9eb04330fdb2-kube-api-access-nmqbf\") pod \"nmstate-handler-qd4sc\" (UID: \"f175bb68-1110-4701-b4a1-9eb04330fdb2\") " pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.283159 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a4975a7f-ddbb-46e3-91be-b1a7757abced-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-5gg88\" (UID: \"a4975a7f-ddbb-46e3-91be-b1a7757abced\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.283219 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f175bb68-1110-4701-b4a1-9eb04330fdb2-dbus-socket\") pod \"nmstate-handler-qd4sc\" 
(UID: \"f175bb68-1110-4701-b4a1-9eb04330fdb2\") " pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.283249 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a4975a7f-ddbb-46e3-91be-b1a7757abced-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5gg88\" (UID: \"a4975a7f-ddbb-46e3-91be-b1a7757abced\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.283630 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f175bb68-1110-4701-b4a1-9eb04330fdb2-dbus-socket\") pod \"nmstate-handler-qd4sc\" (UID: \"f175bb68-1110-4701-b4a1-9eb04330fdb2\") " pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.292348 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-895gl" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.298693 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/593a9109-9e37-47a7-b467-de7be3502ba4-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-65v9t\" (UID: \"593a9109-9e37-47a7-b467-de7be3502ba4\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-65v9t" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.306403 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmqbf\" (UniqueName: \"kubernetes.io/projected/f175bb68-1110-4701-b4a1-9eb04330fdb2-kube-api-access-nmqbf\") pod \"nmstate-handler-qd4sc\" (UID: \"f175bb68-1110-4701-b4a1-9eb04330fdb2\") " pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.306924 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr9bp\" (UniqueName: \"kubernetes.io/projected/593a9109-9e37-47a7-b467-de7be3502ba4-kube-api-access-mr9bp\") pod \"nmstate-webhook-5f558f5558-65v9t\" (UID: \"593a9109-9e37-47a7-b467-de7be3502ba4\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-65v9t" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.338711 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7fd7dbd5d8-jvfqq"] Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.339418 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.350455 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-65v9t" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.354779 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fd7dbd5d8-jvfqq"] Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.363523 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.384395 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a4975a7f-ddbb-46e3-91be-b1a7757abced-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-5gg88\" (UID: \"a4975a7f-ddbb-46e3-91be-b1a7757abced\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.384446 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a4975a7f-ddbb-46e3-91be-b1a7757abced-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5gg88\" (UID: \"a4975a7f-ddbb-46e3-91be-b1a7757abced\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.384499 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2bj9\" (UniqueName: \"kubernetes.io/projected/a4975a7f-ddbb-46e3-91be-b1a7757abced-kube-api-access-g2bj9\") pod \"nmstate-console-plugin-86f58fcf4-5gg88\" (UID: \"a4975a7f-ddbb-46e3-91be-b1a7757abced\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.385578 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a4975a7f-ddbb-46e3-91be-b1a7757abced-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-5gg88\" (UID: 
\"a4975a7f-ddbb-46e3-91be-b1a7757abced\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.388875 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a4975a7f-ddbb-46e3-91be-b1a7757abced-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5gg88\" (UID: \"a4975a7f-ddbb-46e3-91be-b1a7757abced\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.403618 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2bj9\" (UniqueName: \"kubernetes.io/projected/a4975a7f-ddbb-46e3-91be-b1a7757abced-kube-api-access-g2bj9\") pod \"nmstate-console-plugin-86f58fcf4-5gg88\" (UID: \"a4975a7f-ddbb-46e3-91be-b1a7757abced\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88" Mar 18 14:15:36 crc kubenswrapper[4756]: W0318 14:15:36.430144 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf175bb68_1110_4701_b4a1_9eb04330fdb2.slice/crio-289b725d41a23be0b23c7b64e36b089cd1da9411153f703ef00abe8f3e7303a4 WatchSource:0}: Error finding container 289b725d41a23be0b23c7b64e36b089cd1da9411153f703ef00abe8f3e7303a4: Status 404 returned error can't find the container with id 289b725d41a23be0b23c7b64e36b089cd1da9411153f703ef00abe8f3e7303a4 Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.455082 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.485683 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab0cd268-e134-4185-9816-f46bf0c46a44-console-oauth-config\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.485852 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab0cd268-e134-4185-9816-f46bf0c46a44-console-serving-cert\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.485897 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab0cd268-e134-4185-9816-f46bf0c46a44-console-config\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.485941 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6jjm\" (UniqueName: \"kubernetes.io/projected/ab0cd268-e134-4185-9816-f46bf0c46a44-kube-api-access-v6jjm\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.485965 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ab0cd268-e134-4185-9816-f46bf0c46a44-service-ca\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.485993 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab0cd268-e134-4185-9816-f46bf0c46a44-oauth-serving-cert\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.486027 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab0cd268-e134-4185-9816-f46bf0c46a44-trusted-ca-bundle\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.576924 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-65v9t"] Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.586531 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab0cd268-e134-4185-9816-f46bf0c46a44-console-oauth-config\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.586560 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab0cd268-e134-4185-9816-f46bf0c46a44-console-serving-cert\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " 
pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.586579 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab0cd268-e134-4185-9816-f46bf0c46a44-console-config\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.586612 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6jjm\" (UniqueName: \"kubernetes.io/projected/ab0cd268-e134-4185-9816-f46bf0c46a44-kube-api-access-v6jjm\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.586633 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab0cd268-e134-4185-9816-f46bf0c46a44-service-ca\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.586653 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab0cd268-e134-4185-9816-f46bf0c46a44-oauth-serving-cert\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.586685 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab0cd268-e134-4185-9816-f46bf0c46a44-trusted-ca-bundle\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " 
pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.590202 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab0cd268-e134-4185-9816-f46bf0c46a44-oauth-serving-cert\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.590435 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab0cd268-e134-4185-9816-f46bf0c46a44-console-config\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.591299 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab0cd268-e134-4185-9816-f46bf0c46a44-trusted-ca-bundle\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.592328 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab0cd268-e134-4185-9816-f46bf0c46a44-service-ca\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.593715 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab0cd268-e134-4185-9816-f46bf0c46a44-console-serving-cert\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc 
kubenswrapper[4756]: I0318 14:15:36.594556 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab0cd268-e134-4185-9816-f46bf0c46a44-console-oauth-config\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.602328 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6jjm\" (UniqueName: \"kubernetes.io/projected/ab0cd268-e134-4185-9816-f46bf0c46a44-kube-api-access-v6jjm\") pod \"console-7fd7dbd5d8-jvfqq\" (UID: \"ab0cd268-e134-4185-9816-f46bf0c46a44\") " pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.656639 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88"] Mar 18 14:15:36 crc kubenswrapper[4756]: W0318 14:15:36.659480 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4975a7f_ddbb_46e3_91be_b1a7757abced.slice/crio-69a4ed0a1fc35042d286da4ba45e90f24def94c91f59ee9afb5523b2b987afc2 WatchSource:0}: Error finding container 69a4ed0a1fc35042d286da4ba45e90f24def94c91f59ee9afb5523b2b987afc2: Status 404 returned error can't find the container with id 69a4ed0a1fc35042d286da4ba45e90f24def94c91f59ee9afb5523b2b987afc2 Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.700860 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.719847 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-895gl"] Mar 18 14:15:36 crc kubenswrapper[4756]: W0318 14:15:36.721190 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc81cea89_3315_4647_8596_e0132b8dd763.slice/crio-4aefdc5b92d5627d618a6fd73bdcadd34cc5315d662e0415b046e44e60aa58da WatchSource:0}: Error finding container 4aefdc5b92d5627d618a6fd73bdcadd34cc5315d662e0415b046e44e60aa58da: Status 404 returned error can't find the container with id 4aefdc5b92d5627d618a6fd73bdcadd34cc5315d662e0415b046e44e60aa58da Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.915132 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fd7dbd5d8-jvfqq"] Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.915432 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.915486 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.915530 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.916180 4756 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"617eebb4a8c3d04af231bb44e996daa1896f056ada27eee9b25a69c05455bb74"} pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.916242 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" containerID="cri-o://617eebb4a8c3d04af231bb44e996daa1896f056ada27eee9b25a69c05455bb74" gracePeriod=600 Mar 18 14:15:36 crc kubenswrapper[4756]: W0318 14:15:36.918702 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab0cd268_e134_4185_9816_f46bf0c46a44.slice/crio-18f059842625b14f5744a835a8a9860efd305a8c1d7fd7b002dd740e5925ba3d WatchSource:0}: Error finding container 18f059842625b14f5744a835a8a9860efd305a8c1d7fd7b002dd740e5925ba3d: Status 404 returned error can't find the container with id 18f059842625b14f5744a835a8a9860efd305a8c1d7fd7b002dd740e5925ba3d Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.981471 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-65v9t" event={"ID":"593a9109-9e37-47a7-b467-de7be3502ba4","Type":"ContainerStarted","Data":"b794f44674e2b59a8d0980ba6ba4ca9f5a5024500f8a57c103a37d964e8bdca0"} Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.984110 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88" event={"ID":"a4975a7f-ddbb-46e3-91be-b1a7757abced","Type":"ContainerStarted","Data":"69a4ed0a1fc35042d286da4ba45e90f24def94c91f59ee9afb5523b2b987afc2"} Mar 18 14:15:36 crc kubenswrapper[4756]: 
I0318 14:15:36.985950 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fd7dbd5d8-jvfqq" event={"ID":"ab0cd268-e134-4185-9816-f46bf0c46a44","Type":"ContainerStarted","Data":"18f059842625b14f5744a835a8a9860efd305a8c1d7fd7b002dd740e5925ba3d"} Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.987295 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qd4sc" event={"ID":"f175bb68-1110-4701-b4a1-9eb04330fdb2","Type":"ContainerStarted","Data":"289b725d41a23be0b23c7b64e36b089cd1da9411153f703ef00abe8f3e7303a4"} Mar 18 14:15:36 crc kubenswrapper[4756]: I0318 14:15:36.989635 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-895gl" event={"ID":"c81cea89-3315-4647-8596-e0132b8dd763","Type":"ContainerStarted","Data":"4aefdc5b92d5627d618a6fd73bdcadd34cc5315d662e0415b046e44e60aa58da"} Mar 18 14:15:38 crc kubenswrapper[4756]: I0318 14:15:38.022616 4756 generic.go:334] "Generic (PLEG): container finished" podID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerID="617eebb4a8c3d04af231bb44e996daa1896f056ada27eee9b25a69c05455bb74" exitCode=0 Mar 18 14:15:38 crc kubenswrapper[4756]: I0318 14:15:38.023244 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerDied","Data":"617eebb4a8c3d04af231bb44e996daa1896f056ada27eee9b25a69c05455bb74"} Mar 18 14:15:38 crc kubenswrapper[4756]: I0318 14:15:38.024154 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"cdecc63cb2f22e85e1b8370b385518ac960137ad6026c38922b0bcf6a6e1374a"} Mar 18 14:15:38 crc kubenswrapper[4756]: I0318 14:15:38.024185 4756 scope.go:117] "RemoveContainer" 
containerID="5720abaa535d38bd7f462e46e5802b411c62642cbf7053424674cf7b459cf96b" Mar 18 14:15:38 crc kubenswrapper[4756]: I0318 14:15:38.034850 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fd7dbd5d8-jvfqq" event={"ID":"ab0cd268-e134-4185-9816-f46bf0c46a44","Type":"ContainerStarted","Data":"518c3e91ba68cd0848682de87b7ff0f6efea17119c5e4c322d7a7d0e7d30ba99"} Mar 18 14:15:38 crc kubenswrapper[4756]: I0318 14:15:38.071532 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7fd7dbd5d8-jvfqq" podStartSLOduration=2.071513833 podStartE2EDuration="2.071513833s" podCreationTimestamp="2026-03-18 14:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:15:38.071269496 +0000 UTC m=+939.385687491" watchObservedRunningTime="2026-03-18 14:15:38.071513833 +0000 UTC m=+939.385931808" Mar 18 14:15:40 crc kubenswrapper[4756]: I0318 14:15:40.050741 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-65v9t" event={"ID":"593a9109-9e37-47a7-b467-de7be3502ba4","Type":"ContainerStarted","Data":"65a06f3ca45bee45f6d3f999441c13c8560b7cfebb3ba25865fe10f7618041df"} Mar 18 14:15:40 crc kubenswrapper[4756]: I0318 14:15:40.051356 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-65v9t" Mar 18 14:15:40 crc kubenswrapper[4756]: I0318 14:15:40.052045 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88" event={"ID":"a4975a7f-ddbb-46e3-91be-b1a7757abced","Type":"ContainerStarted","Data":"dc728ea5463b45e443b85d58bf94a3e947b1120713b625053c3ae6708789032a"} Mar 18 14:15:40 crc kubenswrapper[4756]: I0318 14:15:40.053165 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qd4sc" 
event={"ID":"f175bb68-1110-4701-b4a1-9eb04330fdb2","Type":"ContainerStarted","Data":"e378fb2eefcb63e0f95864b2258a70e415faa13bd90b85879ae20de2d83703f5"} Mar 18 14:15:40 crc kubenswrapper[4756]: I0318 14:15:40.053230 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 14:15:40 crc kubenswrapper[4756]: I0318 14:15:40.054763 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-895gl" event={"ID":"c81cea89-3315-4647-8596-e0132b8dd763","Type":"ContainerStarted","Data":"10dae997d0617c0f73fdde3093f7377c701746981e1903a869bb02faa12a2be7"} Mar 18 14:15:40 crc kubenswrapper[4756]: I0318 14:15:40.067974 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-65v9t" podStartSLOduration=1.931757025 podStartE2EDuration="5.067953034s" podCreationTimestamp="2026-03-18 14:15:35 +0000 UTC" firstStartedPulling="2026-03-18 14:15:36.585383853 +0000 UTC m=+937.899801828" lastFinishedPulling="2026-03-18 14:15:39.721579862 +0000 UTC m=+941.035997837" observedRunningTime="2026-03-18 14:15:40.064875808 +0000 UTC m=+941.379293783" watchObservedRunningTime="2026-03-18 14:15:40.067953034 +0000 UTC m=+941.382371009" Mar 18 14:15:40 crc kubenswrapper[4756]: I0318 14:15:40.084190 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5gg88" podStartSLOduration=1.025874349 podStartE2EDuration="4.08417444s" podCreationTimestamp="2026-03-18 14:15:36 +0000 UTC" firstStartedPulling="2026-03-18 14:15:36.661498682 +0000 UTC m=+937.975916657" lastFinishedPulling="2026-03-18 14:15:39.719798753 +0000 UTC m=+941.034216748" observedRunningTime="2026-03-18 14:15:40.081361123 +0000 UTC m=+941.395779098" watchObservedRunningTime="2026-03-18 14:15:40.08417444 +0000 UTC m=+941.398592415" Mar 18 14:15:40 crc kubenswrapper[4756]: I0318 14:15:40.100058 4756 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-qd4sc" podStartSLOduration=0.808905576 podStartE2EDuration="4.100039038s" podCreationTimestamp="2026-03-18 14:15:36 +0000 UTC" firstStartedPulling="2026-03-18 14:15:36.43260366 +0000 UTC m=+937.747021635" lastFinishedPulling="2026-03-18 14:15:39.723737092 +0000 UTC m=+941.038155097" observedRunningTime="2026-03-18 14:15:40.096260854 +0000 UTC m=+941.410678829" watchObservedRunningTime="2026-03-18 14:15:40.100039038 +0000 UTC m=+941.414457013" Mar 18 14:15:43 crc kubenswrapper[4756]: I0318 14:15:43.078518 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-895gl" event={"ID":"c81cea89-3315-4647-8596-e0132b8dd763","Type":"ContainerStarted","Data":"62a2bf4b34ba7913226cc07c84efab5bbcfbe3978ca921a94b6d2b114337cf96"} Mar 18 14:15:43 crc kubenswrapper[4756]: I0318 14:15:43.104174 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-895gl" podStartSLOduration=2.60757596 podStartE2EDuration="8.104157935s" podCreationTimestamp="2026-03-18 14:15:35 +0000 UTC" firstStartedPulling="2026-03-18 14:15:36.723451811 +0000 UTC m=+938.037869796" lastFinishedPulling="2026-03-18 14:15:42.220033796 +0000 UTC m=+943.534451771" observedRunningTime="2026-03-18 14:15:43.100026411 +0000 UTC m=+944.414444396" watchObservedRunningTime="2026-03-18 14:15:43.104157935 +0000 UTC m=+944.418575910" Mar 18 14:15:46 crc kubenswrapper[4756]: I0318 14:15:46.398076 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-qd4sc" Mar 18 14:15:46 crc kubenswrapper[4756]: I0318 14:15:46.702082 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:46 crc kubenswrapper[4756]: I0318 14:15:46.702947 4756 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:46 crc kubenswrapper[4756]: I0318 14:15:46.709879 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:47 crc kubenswrapper[4756]: I0318 14:15:47.119681 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7fd7dbd5d8-jvfqq" Mar 18 14:15:47 crc kubenswrapper[4756]: I0318 14:15:47.196774 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-k5xg9"] Mar 18 14:15:56 crc kubenswrapper[4756]: I0318 14:15:56.361809 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-65v9t" Mar 18 14:16:00 crc kubenswrapper[4756]: I0318 14:16:00.144803 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564056-4x5qk"] Mar 18 14:16:00 crc kubenswrapper[4756]: I0318 14:16:00.146176 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564056-4x5qk" Mar 18 14:16:00 crc kubenswrapper[4756]: I0318 14:16:00.150226 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:16:00 crc kubenswrapper[4756]: I0318 14:16:00.150278 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:16:00 crc kubenswrapper[4756]: I0318 14:16:00.150464 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:16:00 crc kubenswrapper[4756]: I0318 14:16:00.163037 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564056-4x5qk"] Mar 18 14:16:00 crc kubenswrapper[4756]: I0318 14:16:00.266085 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99m44\" (UniqueName: \"kubernetes.io/projected/c833b21a-3aef-4e7e-9cf7-39e675c262ab-kube-api-access-99m44\") pod \"auto-csr-approver-29564056-4x5qk\" (UID: \"c833b21a-3aef-4e7e-9cf7-39e675c262ab\") " pod="openshift-infra/auto-csr-approver-29564056-4x5qk" Mar 18 14:16:00 crc kubenswrapper[4756]: I0318 14:16:00.367591 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99m44\" (UniqueName: \"kubernetes.io/projected/c833b21a-3aef-4e7e-9cf7-39e675c262ab-kube-api-access-99m44\") pod \"auto-csr-approver-29564056-4x5qk\" (UID: \"c833b21a-3aef-4e7e-9cf7-39e675c262ab\") " pod="openshift-infra/auto-csr-approver-29564056-4x5qk" Mar 18 14:16:00 crc kubenswrapper[4756]: I0318 14:16:00.395158 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99m44\" (UniqueName: \"kubernetes.io/projected/c833b21a-3aef-4e7e-9cf7-39e675c262ab-kube-api-access-99m44\") pod \"auto-csr-approver-29564056-4x5qk\" (UID: \"c833b21a-3aef-4e7e-9cf7-39e675c262ab\") " 
pod="openshift-infra/auto-csr-approver-29564056-4x5qk" Mar 18 14:16:00 crc kubenswrapper[4756]: I0318 14:16:00.463464 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564056-4x5qk" Mar 18 14:16:00 crc kubenswrapper[4756]: I0318 14:16:00.944811 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564056-4x5qk"] Mar 18 14:16:01 crc kubenswrapper[4756]: I0318 14:16:01.260138 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564056-4x5qk" event={"ID":"c833b21a-3aef-4e7e-9cf7-39e675c262ab","Type":"ContainerStarted","Data":"90e59f713721c65f8119895e4321b42e1bf7aa3be63b8c12e70baf26cfea8d0a"} Mar 18 14:16:03 crc kubenswrapper[4756]: I0318 14:16:03.277593 4756 generic.go:334] "Generic (PLEG): container finished" podID="c833b21a-3aef-4e7e-9cf7-39e675c262ab" containerID="822ec18092ea709c0b867cb7ba08e5d135047c4f3f3353b97c7b082f6c6eeea7" exitCode=0 Mar 18 14:16:03 crc kubenswrapper[4756]: I0318 14:16:03.277898 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564056-4x5qk" event={"ID":"c833b21a-3aef-4e7e-9cf7-39e675c262ab","Type":"ContainerDied","Data":"822ec18092ea709c0b867cb7ba08e5d135047c4f3f3353b97c7b082f6c6eeea7"} Mar 18 14:16:04 crc kubenswrapper[4756]: I0318 14:16:04.653340 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564056-4x5qk" Mar 18 14:16:04 crc kubenswrapper[4756]: I0318 14:16:04.823926 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99m44\" (UniqueName: \"kubernetes.io/projected/c833b21a-3aef-4e7e-9cf7-39e675c262ab-kube-api-access-99m44\") pod \"c833b21a-3aef-4e7e-9cf7-39e675c262ab\" (UID: \"c833b21a-3aef-4e7e-9cf7-39e675c262ab\") " Mar 18 14:16:04 crc kubenswrapper[4756]: I0318 14:16:04.828688 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c833b21a-3aef-4e7e-9cf7-39e675c262ab-kube-api-access-99m44" (OuterVolumeSpecName: "kube-api-access-99m44") pod "c833b21a-3aef-4e7e-9cf7-39e675c262ab" (UID: "c833b21a-3aef-4e7e-9cf7-39e675c262ab"). InnerVolumeSpecName "kube-api-access-99m44". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:16:04 crc kubenswrapper[4756]: I0318 14:16:04.926168 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99m44\" (UniqueName: \"kubernetes.io/projected/c833b21a-3aef-4e7e-9cf7-39e675c262ab-kube-api-access-99m44\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:05 crc kubenswrapper[4756]: I0318 14:16:05.296183 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564056-4x5qk" event={"ID":"c833b21a-3aef-4e7e-9cf7-39e675c262ab","Type":"ContainerDied","Data":"90e59f713721c65f8119895e4321b42e1bf7aa3be63b8c12e70baf26cfea8d0a"} Mar 18 14:16:05 crc kubenswrapper[4756]: I0318 14:16:05.296244 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90e59f713721c65f8119895e4321b42e1bf7aa3be63b8c12e70baf26cfea8d0a" Mar 18 14:16:05 crc kubenswrapper[4756]: I0318 14:16:05.296276 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564056-4x5qk" Mar 18 14:16:05 crc kubenswrapper[4756]: I0318 14:16:05.743004 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564050-f6gfb"] Mar 18 14:16:05 crc kubenswrapper[4756]: I0318 14:16:05.748700 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564050-f6gfb"] Mar 18 14:16:07 crc kubenswrapper[4756]: I0318 14:16:07.323678 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21801a7f-54fd-4ea9-92d5-be66dce1326d" path="/var/lib/kubelet/pods/21801a7f-54fd-4ea9-92d5-be66dce1326d/volumes" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.251788 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-k5xg9" podUID="f88a8bdd-954f-455c-aad1-03b1988afa37" containerName="console" containerID="cri-o://6977df527fac0f6501afd430cd14031bd2aefe17d2553bfa0d03e71bea89d5b4" gracePeriod=15 Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.645545 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-k5xg9_f88a8bdd-954f-455c-aad1-03b1988afa37/console/0.log" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.645840 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.841525 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-service-ca\") pod \"f88a8bdd-954f-455c-aad1-03b1988afa37\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.841650 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f88a8bdd-954f-455c-aad1-03b1988afa37-console-oauth-config\") pod \"f88a8bdd-954f-455c-aad1-03b1988afa37\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.841689 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-oauth-serving-cert\") pod \"f88a8bdd-954f-455c-aad1-03b1988afa37\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.841762 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-console-config\") pod \"f88a8bdd-954f-455c-aad1-03b1988afa37\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.841788 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-trusted-ca-bundle\") pod \"f88a8bdd-954f-455c-aad1-03b1988afa37\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.841814 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-dx8lv\" (UniqueName: \"kubernetes.io/projected/f88a8bdd-954f-455c-aad1-03b1988afa37-kube-api-access-dx8lv\") pod \"f88a8bdd-954f-455c-aad1-03b1988afa37\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.841869 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f88a8bdd-954f-455c-aad1-03b1988afa37-console-serving-cert\") pod \"f88a8bdd-954f-455c-aad1-03b1988afa37\" (UID: \"f88a8bdd-954f-455c-aad1-03b1988afa37\") " Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.842646 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f88a8bdd-954f-455c-aad1-03b1988afa37" (UID: "f88a8bdd-954f-455c-aad1-03b1988afa37"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.842657 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-service-ca" (OuterVolumeSpecName: "service-ca") pod "f88a8bdd-954f-455c-aad1-03b1988afa37" (UID: "f88a8bdd-954f-455c-aad1-03b1988afa37"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.843044 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f88a8bdd-954f-455c-aad1-03b1988afa37" (UID: "f88a8bdd-954f-455c-aad1-03b1988afa37"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.843179 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-console-config" (OuterVolumeSpecName: "console-config") pod "f88a8bdd-954f-455c-aad1-03b1988afa37" (UID: "f88a8bdd-954f-455c-aad1-03b1988afa37"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.849267 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88a8bdd-954f-455c-aad1-03b1988afa37-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f88a8bdd-954f-455c-aad1-03b1988afa37" (UID: "f88a8bdd-954f-455c-aad1-03b1988afa37"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.849731 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88a8bdd-954f-455c-aad1-03b1988afa37-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f88a8bdd-954f-455c-aad1-03b1988afa37" (UID: "f88a8bdd-954f-455c-aad1-03b1988afa37"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.849945 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88a8bdd-954f-455c-aad1-03b1988afa37-kube-api-access-dx8lv" (OuterVolumeSpecName: "kube-api-access-dx8lv") pod "f88a8bdd-954f-455c-aad1-03b1988afa37" (UID: "f88a8bdd-954f-455c-aad1-03b1988afa37"). InnerVolumeSpecName "kube-api-access-dx8lv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.943461 4756 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.943515 4756 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.943540 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx8lv\" (UniqueName: \"kubernetes.io/projected/f88a8bdd-954f-455c-aad1-03b1988afa37-kube-api-access-dx8lv\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.943565 4756 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f88a8bdd-954f-455c-aad1-03b1988afa37-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.943588 4756 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.943612 4756 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f88a8bdd-954f-455c-aad1-03b1988afa37-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:12 crc kubenswrapper[4756]: I0318 14:16:12.943634 4756 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f88a8bdd-954f-455c-aad1-03b1988afa37-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:13 crc 
kubenswrapper[4756]: I0318 14:16:13.357931 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-k5xg9_f88a8bdd-954f-455c-aad1-03b1988afa37/console/0.log" Mar 18 14:16:13 crc kubenswrapper[4756]: I0318 14:16:13.357981 4756 generic.go:334] "Generic (PLEG): container finished" podID="f88a8bdd-954f-455c-aad1-03b1988afa37" containerID="6977df527fac0f6501afd430cd14031bd2aefe17d2553bfa0d03e71bea89d5b4" exitCode=2 Mar 18 14:16:13 crc kubenswrapper[4756]: I0318 14:16:13.358013 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k5xg9" event={"ID":"f88a8bdd-954f-455c-aad1-03b1988afa37","Type":"ContainerDied","Data":"6977df527fac0f6501afd430cd14031bd2aefe17d2553bfa0d03e71bea89d5b4"} Mar 18 14:16:13 crc kubenswrapper[4756]: I0318 14:16:13.358042 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k5xg9" event={"ID":"f88a8bdd-954f-455c-aad1-03b1988afa37","Type":"ContainerDied","Data":"22e3b1e94cef2835a0cfbc2dc20b571b8dc0198acbfab345e7e9b7ebba9d182c"} Mar 18 14:16:13 crc kubenswrapper[4756]: I0318 14:16:13.358062 4756 scope.go:117] "RemoveContainer" containerID="6977df527fac0f6501afd430cd14031bd2aefe17d2553bfa0d03e71bea89d5b4" Mar 18 14:16:13 crc kubenswrapper[4756]: I0318 14:16:13.358205 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-k5xg9" Mar 18 14:16:13 crc kubenswrapper[4756]: I0318 14:16:13.389291 4756 scope.go:117] "RemoveContainer" containerID="6977df527fac0f6501afd430cd14031bd2aefe17d2553bfa0d03e71bea89d5b4" Mar 18 14:16:13 crc kubenswrapper[4756]: I0318 14:16:13.390861 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-k5xg9"] Mar 18 14:16:13 crc kubenswrapper[4756]: E0318 14:16:13.393447 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6977df527fac0f6501afd430cd14031bd2aefe17d2553bfa0d03e71bea89d5b4\": container with ID starting with 6977df527fac0f6501afd430cd14031bd2aefe17d2553bfa0d03e71bea89d5b4 not found: ID does not exist" containerID="6977df527fac0f6501afd430cd14031bd2aefe17d2553bfa0d03e71bea89d5b4" Mar 18 14:16:13 crc kubenswrapper[4756]: I0318 14:16:13.393491 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6977df527fac0f6501afd430cd14031bd2aefe17d2553bfa0d03e71bea89d5b4"} err="failed to get container status \"6977df527fac0f6501afd430cd14031bd2aefe17d2553bfa0d03e71bea89d5b4\": rpc error: code = NotFound desc = could not find container \"6977df527fac0f6501afd430cd14031bd2aefe17d2553bfa0d03e71bea89d5b4\": container with ID starting with 6977df527fac0f6501afd430cd14031bd2aefe17d2553bfa0d03e71bea89d5b4 not found: ID does not exist" Mar 18 14:16:13 crc kubenswrapper[4756]: I0318 14:16:13.399045 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-k5xg9"] Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.357087 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f"] Mar 18 14:16:14 crc kubenswrapper[4756]: E0318 14:16:14.357340 4756 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c833b21a-3aef-4e7e-9cf7-39e675c262ab" containerName="oc" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.357357 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c833b21a-3aef-4e7e-9cf7-39e675c262ab" containerName="oc" Mar 18 14:16:14 crc kubenswrapper[4756]: E0318 14:16:14.357370 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f88a8bdd-954f-455c-aad1-03b1988afa37" containerName="console" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.357377 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f88a8bdd-954f-455c-aad1-03b1988afa37" containerName="console" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.357489 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c833b21a-3aef-4e7e-9cf7-39e675c262ab" containerName="oc" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.357505 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f88a8bdd-954f-455c-aad1-03b1988afa37" containerName="console" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.358244 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.362734 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.373793 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f"] Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.469126 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8383b75-db63-4c4e-b9c1-3fad8e459899-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f\" (UID: \"d8383b75-db63-4c4e-b9c1-3fad8e459899\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.469198 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8383b75-db63-4c4e-b9c1-3fad8e459899-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f\" (UID: \"d8383b75-db63-4c4e-b9c1-3fad8e459899\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.469279 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h87ng\" (UniqueName: \"kubernetes.io/projected/d8383b75-db63-4c4e-b9c1-3fad8e459899-kube-api-access-h87ng\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f\" (UID: \"d8383b75-db63-4c4e-b9c1-3fad8e459899\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" Mar 18 14:16:14 crc kubenswrapper[4756]: 
I0318 14:16:14.570463 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h87ng\" (UniqueName: \"kubernetes.io/projected/d8383b75-db63-4c4e-b9c1-3fad8e459899-kube-api-access-h87ng\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f\" (UID: \"d8383b75-db63-4c4e-b9c1-3fad8e459899\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.570512 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8383b75-db63-4c4e-b9c1-3fad8e459899-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f\" (UID: \"d8383b75-db63-4c4e-b9c1-3fad8e459899\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.570557 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8383b75-db63-4c4e-b9c1-3fad8e459899-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f\" (UID: \"d8383b75-db63-4c4e-b9c1-3fad8e459899\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.570988 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8383b75-db63-4c4e-b9c1-3fad8e459899-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f\" (UID: \"d8383b75-db63-4c4e-b9c1-3fad8e459899\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.571137 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d8383b75-db63-4c4e-b9c1-3fad8e459899-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f\" (UID: \"d8383b75-db63-4c4e-b9c1-3fad8e459899\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.597260 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h87ng\" (UniqueName: \"kubernetes.io/projected/d8383b75-db63-4c4e-b9c1-3fad8e459899-kube-api-access-h87ng\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f\" (UID: \"d8383b75-db63-4c4e-b9c1-3fad8e459899\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.682802 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" Mar 18 14:16:14 crc kubenswrapper[4756]: I0318 14:16:14.920614 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f"] Mar 18 14:16:15 crc kubenswrapper[4756]: I0318 14:16:15.321599 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88a8bdd-954f-455c-aad1-03b1988afa37" path="/var/lib/kubelet/pods/f88a8bdd-954f-455c-aad1-03b1988afa37/volumes" Mar 18 14:16:15 crc kubenswrapper[4756]: I0318 14:16:15.389981 4756 generic.go:334] "Generic (PLEG): container finished" podID="d8383b75-db63-4c4e-b9c1-3fad8e459899" containerID="b0246dcc07b1bad2d9e01b7aa6db8d6c1feef83696157fc98555b5a9270f74ad" exitCode=0 Mar 18 14:16:15 crc kubenswrapper[4756]: I0318 14:16:15.390023 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" 
event={"ID":"d8383b75-db63-4c4e-b9c1-3fad8e459899","Type":"ContainerDied","Data":"b0246dcc07b1bad2d9e01b7aa6db8d6c1feef83696157fc98555b5a9270f74ad"} Mar 18 14:16:15 crc kubenswrapper[4756]: I0318 14:16:15.390052 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" event={"ID":"d8383b75-db63-4c4e-b9c1-3fad8e459899","Type":"ContainerStarted","Data":"e6616dbd2de51b36f6efa73b60bc4d3cac6817cab020fb3e6ab8a4c297822f0a"} Mar 18 14:16:17 crc kubenswrapper[4756]: I0318 14:16:17.404423 4756 generic.go:334] "Generic (PLEG): container finished" podID="d8383b75-db63-4c4e-b9c1-3fad8e459899" containerID="edb538c2a46a745afb183541e9bbeb1f1bd0f462fe92decad183352582ec9e9e" exitCode=0 Mar 18 14:16:17 crc kubenswrapper[4756]: I0318 14:16:17.404792 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" event={"ID":"d8383b75-db63-4c4e-b9c1-3fad8e459899","Type":"ContainerDied","Data":"edb538c2a46a745afb183541e9bbeb1f1bd0f462fe92decad183352582ec9e9e"} Mar 18 14:16:18 crc kubenswrapper[4756]: I0318 14:16:18.417680 4756 generic.go:334] "Generic (PLEG): container finished" podID="d8383b75-db63-4c4e-b9c1-3fad8e459899" containerID="8778ba3ecea1a4465602c3f9ed40876e789ab181c8760ca474ae4dbe2c64085f" exitCode=0 Mar 18 14:16:18 crc kubenswrapper[4756]: I0318 14:16:18.417790 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" event={"ID":"d8383b75-db63-4c4e-b9c1-3fad8e459899","Type":"ContainerDied","Data":"8778ba3ecea1a4465602c3f9ed40876e789ab181c8760ca474ae4dbe2c64085f"} Mar 18 14:16:19 crc kubenswrapper[4756]: I0318 14:16:19.730736 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" Mar 18 14:16:19 crc kubenswrapper[4756]: I0318 14:16:19.851905 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8383b75-db63-4c4e-b9c1-3fad8e459899-bundle\") pod \"d8383b75-db63-4c4e-b9c1-3fad8e459899\" (UID: \"d8383b75-db63-4c4e-b9c1-3fad8e459899\") " Mar 18 14:16:19 crc kubenswrapper[4756]: I0318 14:16:19.852044 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8383b75-db63-4c4e-b9c1-3fad8e459899-util\") pod \"d8383b75-db63-4c4e-b9c1-3fad8e459899\" (UID: \"d8383b75-db63-4c4e-b9c1-3fad8e459899\") " Mar 18 14:16:19 crc kubenswrapper[4756]: I0318 14:16:19.852105 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h87ng\" (UniqueName: \"kubernetes.io/projected/d8383b75-db63-4c4e-b9c1-3fad8e459899-kube-api-access-h87ng\") pod \"d8383b75-db63-4c4e-b9c1-3fad8e459899\" (UID: \"d8383b75-db63-4c4e-b9c1-3fad8e459899\") " Mar 18 14:16:19 crc kubenswrapper[4756]: I0318 14:16:19.853009 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8383b75-db63-4c4e-b9c1-3fad8e459899-bundle" (OuterVolumeSpecName: "bundle") pod "d8383b75-db63-4c4e-b9c1-3fad8e459899" (UID: "d8383b75-db63-4c4e-b9c1-3fad8e459899"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:16:19 crc kubenswrapper[4756]: I0318 14:16:19.857810 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8383b75-db63-4c4e-b9c1-3fad8e459899-kube-api-access-h87ng" (OuterVolumeSpecName: "kube-api-access-h87ng") pod "d8383b75-db63-4c4e-b9c1-3fad8e459899" (UID: "d8383b75-db63-4c4e-b9c1-3fad8e459899"). InnerVolumeSpecName "kube-api-access-h87ng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:16:19 crc kubenswrapper[4756]: I0318 14:16:19.869937 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8383b75-db63-4c4e-b9c1-3fad8e459899-util" (OuterVolumeSpecName: "util") pod "d8383b75-db63-4c4e-b9c1-3fad8e459899" (UID: "d8383b75-db63-4c4e-b9c1-3fad8e459899"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:16:19 crc kubenswrapper[4756]: I0318 14:16:19.953685 4756 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8383b75-db63-4c4e-b9c1-3fad8e459899-util\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:19 crc kubenswrapper[4756]: I0318 14:16:19.954004 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h87ng\" (UniqueName: \"kubernetes.io/projected/d8383b75-db63-4c4e-b9c1-3fad8e459899-kube-api-access-h87ng\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:19 crc kubenswrapper[4756]: I0318 14:16:19.954141 4756 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8383b75-db63-4c4e-b9c1-3fad8e459899-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:20 crc kubenswrapper[4756]: I0318 14:16:20.418718 4756 scope.go:117] "RemoveContainer" containerID="868752be180b3ed97411de02996ed876c4709543caa7e4f3dbeeb6384b1cfd0d" Mar 18 14:16:20 crc kubenswrapper[4756]: I0318 14:16:20.445741 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" event={"ID":"d8383b75-db63-4c4e-b9c1-3fad8e459899","Type":"ContainerDied","Data":"e6616dbd2de51b36f6efa73b60bc4d3cac6817cab020fb3e6ab8a4c297822f0a"} Mar 18 14:16:20 crc kubenswrapper[4756]: I0318 14:16:20.445790 4756 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e6616dbd2de51b36f6efa73b60bc4d3cac6817cab020fb3e6ab8a4c297822f0a" Mar 18 14:16:20 crc kubenswrapper[4756]: I0318 14:16:20.445865 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.566940 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht"] Mar 18 14:16:28 crc kubenswrapper[4756]: E0318 14:16:28.567559 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8383b75-db63-4c4e-b9c1-3fad8e459899" containerName="util" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.567570 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8383b75-db63-4c4e-b9c1-3fad8e459899" containerName="util" Mar 18 14:16:28 crc kubenswrapper[4756]: E0318 14:16:28.567584 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8383b75-db63-4c4e-b9c1-3fad8e459899" containerName="extract" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.567590 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8383b75-db63-4c4e-b9c1-3fad8e459899" containerName="extract" Mar 18 14:16:28 crc kubenswrapper[4756]: E0318 14:16:28.567600 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8383b75-db63-4c4e-b9c1-3fad8e459899" containerName="pull" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.567607 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8383b75-db63-4c4e-b9c1-3fad8e459899" containerName="pull" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.567703 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8383b75-db63-4c4e-b9c1-3fad8e459899" containerName="extract" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.568073 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.570179 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.570404 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.570447 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-lgvpr" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.570590 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.570853 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.586794 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht"] Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.671778 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsthd\" (UniqueName: \"kubernetes.io/projected/7f228f4e-f45a-4cf9-b7d0-b4f44a143c17-kube-api-access-qsthd\") pod \"metallb-operator-controller-manager-64f4bf856c-2qfht\" (UID: \"7f228f4e-f45a-4cf9-b7d0-b4f44a143c17\") " pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.671826 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f228f4e-f45a-4cf9-b7d0-b4f44a143c17-webhook-cert\") pod 
\"metallb-operator-controller-manager-64f4bf856c-2qfht\" (UID: \"7f228f4e-f45a-4cf9-b7d0-b4f44a143c17\") " pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.671851 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f228f4e-f45a-4cf9-b7d0-b4f44a143c17-apiservice-cert\") pod \"metallb-operator-controller-manager-64f4bf856c-2qfht\" (UID: \"7f228f4e-f45a-4cf9-b7d0-b4f44a143c17\") " pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.773517 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsthd\" (UniqueName: \"kubernetes.io/projected/7f228f4e-f45a-4cf9-b7d0-b4f44a143c17-kube-api-access-qsthd\") pod \"metallb-operator-controller-manager-64f4bf856c-2qfht\" (UID: \"7f228f4e-f45a-4cf9-b7d0-b4f44a143c17\") " pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.773579 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f228f4e-f45a-4cf9-b7d0-b4f44a143c17-webhook-cert\") pod \"metallb-operator-controller-manager-64f4bf856c-2qfht\" (UID: \"7f228f4e-f45a-4cf9-b7d0-b4f44a143c17\") " pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.773606 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f228f4e-f45a-4cf9-b7d0-b4f44a143c17-apiservice-cert\") pod \"metallb-operator-controller-manager-64f4bf856c-2qfht\" (UID: \"7f228f4e-f45a-4cf9-b7d0-b4f44a143c17\") " pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" Mar 18 14:16:28 crc 
kubenswrapper[4756]: I0318 14:16:28.779724 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f228f4e-f45a-4cf9-b7d0-b4f44a143c17-webhook-cert\") pod \"metallb-operator-controller-manager-64f4bf856c-2qfht\" (UID: \"7f228f4e-f45a-4cf9-b7d0-b4f44a143c17\") " pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.779789 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f228f4e-f45a-4cf9-b7d0-b4f44a143c17-apiservice-cert\") pod \"metallb-operator-controller-manager-64f4bf856c-2qfht\" (UID: \"7f228f4e-f45a-4cf9-b7d0-b4f44a143c17\") " pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.793185 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsthd\" (UniqueName: \"kubernetes.io/projected/7f228f4e-f45a-4cf9-b7d0-b4f44a143c17-kube-api-access-qsthd\") pod \"metallb-operator-controller-manager-64f4bf856c-2qfht\" (UID: \"7f228f4e-f45a-4cf9-b7d0-b4f44a143c17\") " pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" Mar 18 14:16:28 crc kubenswrapper[4756]: I0318 14:16:28.883042 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.015452 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg"] Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.016286 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.022812 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.023095 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.023230 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-22t8d" Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.028488 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg"] Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.178365 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e9fab363-b0f2-4beb-ae91-d3cfdac9e407-apiservice-cert\") pod \"metallb-operator-webhook-server-7989c5c544-j8vcg\" (UID: \"e9fab363-b0f2-4beb-ae91-d3cfdac9e407\") " pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.178424 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6mmw\" (UniqueName: \"kubernetes.io/projected/e9fab363-b0f2-4beb-ae91-d3cfdac9e407-kube-api-access-r6mmw\") pod \"metallb-operator-webhook-server-7989c5c544-j8vcg\" (UID: \"e9fab363-b0f2-4beb-ae91-d3cfdac9e407\") " pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.178449 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/e9fab363-b0f2-4beb-ae91-d3cfdac9e407-webhook-cert\") pod \"metallb-operator-webhook-server-7989c5c544-j8vcg\" (UID: \"e9fab363-b0f2-4beb-ae91-d3cfdac9e407\") " pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.279493 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e9fab363-b0f2-4beb-ae91-d3cfdac9e407-apiservice-cert\") pod \"metallb-operator-webhook-server-7989c5c544-j8vcg\" (UID: \"e9fab363-b0f2-4beb-ae91-d3cfdac9e407\") " pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.279553 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6mmw\" (UniqueName: \"kubernetes.io/projected/e9fab363-b0f2-4beb-ae91-d3cfdac9e407-kube-api-access-r6mmw\") pod \"metallb-operator-webhook-server-7989c5c544-j8vcg\" (UID: \"e9fab363-b0f2-4beb-ae91-d3cfdac9e407\") " pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.279587 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e9fab363-b0f2-4beb-ae91-d3cfdac9e407-webhook-cert\") pod \"metallb-operator-webhook-server-7989c5c544-j8vcg\" (UID: \"e9fab363-b0f2-4beb-ae91-d3cfdac9e407\") " pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.283078 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e9fab363-b0f2-4beb-ae91-d3cfdac9e407-webhook-cert\") pod \"metallb-operator-webhook-server-7989c5c544-j8vcg\" (UID: \"e9fab363-b0f2-4beb-ae91-d3cfdac9e407\") " pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" Mar 18 14:16:29 crc 
kubenswrapper[4756]: I0318 14:16:29.298703 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e9fab363-b0f2-4beb-ae91-d3cfdac9e407-apiservice-cert\") pod \"metallb-operator-webhook-server-7989c5c544-j8vcg\" (UID: \"e9fab363-b0f2-4beb-ae91-d3cfdac9e407\") " pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.300709 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6mmw\" (UniqueName: \"kubernetes.io/projected/e9fab363-b0f2-4beb-ae91-d3cfdac9e407-kube-api-access-r6mmw\") pod \"metallb-operator-webhook-server-7989c5c544-j8vcg\" (UID: \"e9fab363-b0f2-4beb-ae91-d3cfdac9e407\") " pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.354610 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.366990 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht"] Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.513215 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" event={"ID":"7f228f4e-f45a-4cf9-b7d0-b4f44a143c17","Type":"ContainerStarted","Data":"32061df64f4596b5723ecf6e22e5b5209c51bce7881fff25e5752edc5fb73073"} Mar 18 14:16:29 crc kubenswrapper[4756]: I0318 14:16:29.542938 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg"] Mar 18 14:16:29 crc kubenswrapper[4756]: W0318 14:16:29.548888 4756 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9fab363_b0f2_4beb_ae91_d3cfdac9e407.slice/crio-bf8003869bd5655163769b30c06d8ae8042a404c9080064ed26e52167ae1445b WatchSource:0}: Error finding container bf8003869bd5655163769b30c06d8ae8042a404c9080064ed26e52167ae1445b: Status 404 returned error can't find the container with id bf8003869bd5655163769b30c06d8ae8042a404c9080064ed26e52167ae1445b Mar 18 14:16:30 crc kubenswrapper[4756]: I0318 14:16:30.528555 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" event={"ID":"e9fab363-b0f2-4beb-ae91-d3cfdac9e407","Type":"ContainerStarted","Data":"bf8003869bd5655163769b30c06d8ae8042a404c9080064ed26e52167ae1445b"} Mar 18 14:16:35 crc kubenswrapper[4756]: I0318 14:16:35.565237 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" event={"ID":"e9fab363-b0f2-4beb-ae91-d3cfdac9e407","Type":"ContainerStarted","Data":"43b417212e09b389f10f27ef3fd353c70ba60bf1f78e6c34134796c829ca8fe2"} Mar 18 14:16:35 crc kubenswrapper[4756]: I0318 14:16:35.566037 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" Mar 18 14:16:35 crc kubenswrapper[4756]: I0318 14:16:35.568883 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" event={"ID":"7f228f4e-f45a-4cf9-b7d0-b4f44a143c17","Type":"ContainerStarted","Data":"f817d025af8619b253d9866eca2179b3a24dcefe5502a9f22f740f0065eb4399"} Mar 18 14:16:35 crc kubenswrapper[4756]: I0318 14:16:35.569351 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" Mar 18 14:16:35 crc kubenswrapper[4756]: I0318 14:16:35.621959 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" podStartSLOduration=2.77277375 podStartE2EDuration="7.621936545s" podCreationTimestamp="2026-03-18 14:16:28 +0000 UTC" firstStartedPulling="2026-03-18 14:16:29.552230095 +0000 UTC m=+990.866648070" lastFinishedPulling="2026-03-18 14:16:34.40139289 +0000 UTC m=+995.715810865" observedRunningTime="2026-03-18 14:16:35.602638854 +0000 UTC m=+996.917056849" watchObservedRunningTime="2026-03-18 14:16:35.621936545 +0000 UTC m=+996.936354530" Mar 18 14:16:35 crc kubenswrapper[4756]: I0318 14:16:35.633326 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" podStartSLOduration=2.632673698 podStartE2EDuration="7.633304209s" podCreationTimestamp="2026-03-18 14:16:28 +0000 UTC" firstStartedPulling="2026-03-18 14:16:29.381138848 +0000 UTC m=+990.695556823" lastFinishedPulling="2026-03-18 14:16:34.381769349 +0000 UTC m=+995.696187334" observedRunningTime="2026-03-18 14:16:35.628528908 +0000 UTC m=+996.942946893" watchObservedRunningTime="2026-03-18 14:16:35.633304209 +0000 UTC m=+996.947722194" Mar 18 14:16:36 crc kubenswrapper[4756]: I0318 14:16:36.288878 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7gcpj"] Mar 18 14:16:36 crc kubenswrapper[4756]: I0318 14:16:36.290620 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:36 crc kubenswrapper[4756]: I0318 14:16:36.298135 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gcpj"] Mar 18 14:16:36 crc kubenswrapper[4756]: I0318 14:16:36.388972 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6907-72fe-449c-9f65-9aa100b67321-catalog-content\") pod \"redhat-marketplace-7gcpj\" (UID: \"2c0c6907-72fe-449c-9f65-9aa100b67321\") " pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:36 crc kubenswrapper[4756]: I0318 14:16:36.389074 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2n2v\" (UniqueName: \"kubernetes.io/projected/2c0c6907-72fe-449c-9f65-9aa100b67321-kube-api-access-g2n2v\") pod \"redhat-marketplace-7gcpj\" (UID: \"2c0c6907-72fe-449c-9f65-9aa100b67321\") " pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:36 crc kubenswrapper[4756]: I0318 14:16:36.389147 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6907-72fe-449c-9f65-9aa100b67321-utilities\") pod \"redhat-marketplace-7gcpj\" (UID: \"2c0c6907-72fe-449c-9f65-9aa100b67321\") " pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:36 crc kubenswrapper[4756]: I0318 14:16:36.490925 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6907-72fe-449c-9f65-9aa100b67321-catalog-content\") pod \"redhat-marketplace-7gcpj\" (UID: \"2c0c6907-72fe-449c-9f65-9aa100b67321\") " pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:36 crc kubenswrapper[4756]: I0318 14:16:36.491031 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g2n2v\" (UniqueName: \"kubernetes.io/projected/2c0c6907-72fe-449c-9f65-9aa100b67321-kube-api-access-g2n2v\") pod \"redhat-marketplace-7gcpj\" (UID: \"2c0c6907-72fe-449c-9f65-9aa100b67321\") " pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:36 crc kubenswrapper[4756]: I0318 14:16:36.491076 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6907-72fe-449c-9f65-9aa100b67321-utilities\") pod \"redhat-marketplace-7gcpj\" (UID: \"2c0c6907-72fe-449c-9f65-9aa100b67321\") " pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:36 crc kubenswrapper[4756]: I0318 14:16:36.491838 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6907-72fe-449c-9f65-9aa100b67321-catalog-content\") pod \"redhat-marketplace-7gcpj\" (UID: \"2c0c6907-72fe-449c-9f65-9aa100b67321\") " pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:36 crc kubenswrapper[4756]: I0318 14:16:36.492336 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6907-72fe-449c-9f65-9aa100b67321-utilities\") pod \"redhat-marketplace-7gcpj\" (UID: \"2c0c6907-72fe-449c-9f65-9aa100b67321\") " pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:36 crc kubenswrapper[4756]: I0318 14:16:36.511428 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2n2v\" (UniqueName: \"kubernetes.io/projected/2c0c6907-72fe-449c-9f65-9aa100b67321-kube-api-access-g2n2v\") pod \"redhat-marketplace-7gcpj\" (UID: \"2c0c6907-72fe-449c-9f65-9aa100b67321\") " pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:36 crc kubenswrapper[4756]: I0318 14:16:36.614372 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:37 crc kubenswrapper[4756]: I0318 14:16:37.073714 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gcpj"] Mar 18 14:16:37 crc kubenswrapper[4756]: I0318 14:16:37.582989 4756 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6907-72fe-449c-9f65-9aa100b67321" containerID="0a89082c06f0a183616d701fcd3acdb0ca28ba1e8fe31c1e81e4da8ba95e9e4e" exitCode=0 Mar 18 14:16:37 crc kubenswrapper[4756]: I0318 14:16:37.583041 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gcpj" event={"ID":"2c0c6907-72fe-449c-9f65-9aa100b67321","Type":"ContainerDied","Data":"0a89082c06f0a183616d701fcd3acdb0ca28ba1e8fe31c1e81e4da8ba95e9e4e"} Mar 18 14:16:37 crc kubenswrapper[4756]: I0318 14:16:37.583070 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gcpj" event={"ID":"2c0c6907-72fe-449c-9f65-9aa100b67321","Type":"ContainerStarted","Data":"ac16c04939baeb69375764bcb837e755b3427ae3947f7140eda72c6d28998139"} Mar 18 14:16:38 crc kubenswrapper[4756]: I0318 14:16:38.592044 4756 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6907-72fe-449c-9f65-9aa100b67321" containerID="160cd778b4aba73e00e9c9d19cafdebad39662b1aecccf7da5c6020cc789ac5d" exitCode=0 Mar 18 14:16:38 crc kubenswrapper[4756]: I0318 14:16:38.592111 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gcpj" event={"ID":"2c0c6907-72fe-449c-9f65-9aa100b67321","Type":"ContainerDied","Data":"160cd778b4aba73e00e9c9d19cafdebad39662b1aecccf7da5c6020cc789ac5d"} Mar 18 14:16:39 crc kubenswrapper[4756]: I0318 14:16:39.603297 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gcpj" 
event={"ID":"2c0c6907-72fe-449c-9f65-9aa100b67321","Type":"ContainerStarted","Data":"e8a9e40ed6c56cbdfe27def9b6cab37d28589c5fe23e1c979f5d5b91a9c5d5bd"} Mar 18 14:16:39 crc kubenswrapper[4756]: I0318 14:16:39.624891 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7gcpj" podStartSLOduration=2.208317233 podStartE2EDuration="3.624865054s" podCreationTimestamp="2026-03-18 14:16:36 +0000 UTC" firstStartedPulling="2026-03-18 14:16:37.584676667 +0000 UTC m=+998.899094652" lastFinishedPulling="2026-03-18 14:16:39.001224498 +0000 UTC m=+1000.315642473" observedRunningTime="2026-03-18 14:16:39.621581534 +0000 UTC m=+1000.935999559" watchObservedRunningTime="2026-03-18 14:16:39.624865054 +0000 UTC m=+1000.939283069" Mar 18 14:16:46 crc kubenswrapper[4756]: I0318 14:16:46.615168 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:46 crc kubenswrapper[4756]: I0318 14:16:46.615653 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:46 crc kubenswrapper[4756]: I0318 14:16:46.670389 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:46 crc kubenswrapper[4756]: I0318 14:16:46.711756 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.063836 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gcpj"] Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.064412 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7gcpj" podUID="2c0c6907-72fe-449c-9f65-9aa100b67321" containerName="registry-server" 
containerID="cri-o://e8a9e40ed6c56cbdfe27def9b6cab37d28589c5fe23e1c979f5d5b91a9c5d5bd" gracePeriod=2 Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.363396 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7989c5c544-j8vcg" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.461470 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.652106 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6907-72fe-449c-9f65-9aa100b67321-utilities\") pod \"2c0c6907-72fe-449c-9f65-9aa100b67321\" (UID: \"2c0c6907-72fe-449c-9f65-9aa100b67321\") " Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.652212 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2n2v\" (UniqueName: \"kubernetes.io/projected/2c0c6907-72fe-449c-9f65-9aa100b67321-kube-api-access-g2n2v\") pod \"2c0c6907-72fe-449c-9f65-9aa100b67321\" (UID: \"2c0c6907-72fe-449c-9f65-9aa100b67321\") " Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.652246 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6907-72fe-449c-9f65-9aa100b67321-catalog-content\") pod \"2c0c6907-72fe-449c-9f65-9aa100b67321\" (UID: \"2c0c6907-72fe-449c-9f65-9aa100b67321\") " Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.652888 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c0c6907-72fe-449c-9f65-9aa100b67321-utilities" (OuterVolumeSpecName: "utilities") pod "2c0c6907-72fe-449c-9f65-9aa100b67321" (UID: "2c0c6907-72fe-449c-9f65-9aa100b67321"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.663657 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c0c6907-72fe-449c-9f65-9aa100b67321-kube-api-access-g2n2v" (OuterVolumeSpecName: "kube-api-access-g2n2v") pod "2c0c6907-72fe-449c-9f65-9aa100b67321" (UID: "2c0c6907-72fe-449c-9f65-9aa100b67321"). InnerVolumeSpecName "kube-api-access-g2n2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.674502 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c0c6907-72fe-449c-9f65-9aa100b67321-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c0c6907-72fe-449c-9f65-9aa100b67321" (UID: "2c0c6907-72fe-449c-9f65-9aa100b67321"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.683021 4756 generic.go:334] "Generic (PLEG): container finished" podID="2c0c6907-72fe-449c-9f65-9aa100b67321" containerID="e8a9e40ed6c56cbdfe27def9b6cab37d28589c5fe23e1c979f5d5b91a9c5d5bd" exitCode=0 Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.683057 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gcpj" event={"ID":"2c0c6907-72fe-449c-9f65-9aa100b67321","Type":"ContainerDied","Data":"e8a9e40ed6c56cbdfe27def9b6cab37d28589c5fe23e1c979f5d5b91a9c5d5bd"} Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.683083 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gcpj" event={"ID":"2c0c6907-72fe-449c-9f65-9aa100b67321","Type":"ContainerDied","Data":"ac16c04939baeb69375764bcb837e755b3427ae3947f7140eda72c6d28998139"} Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.683079 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gcpj" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.683099 4756 scope.go:117] "RemoveContainer" containerID="e8a9e40ed6c56cbdfe27def9b6cab37d28589c5fe23e1c979f5d5b91a9c5d5bd" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.708561 4756 scope.go:117] "RemoveContainer" containerID="160cd778b4aba73e00e9c9d19cafdebad39662b1aecccf7da5c6020cc789ac5d" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.711856 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gcpj"] Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.716664 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gcpj"] Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.742054 4756 scope.go:117] "RemoveContainer" containerID="0a89082c06f0a183616d701fcd3acdb0ca28ba1e8fe31c1e81e4da8ba95e9e4e" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.753237 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6907-72fe-449c-9f65-9aa100b67321-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.753267 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c0c6907-72fe-449c-9f65-9aa100b67321-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.753287 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2n2v\" (UniqueName: \"kubernetes.io/projected/2c0c6907-72fe-449c-9f65-9aa100b67321-kube-api-access-g2n2v\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.772331 4756 scope.go:117] "RemoveContainer" containerID="e8a9e40ed6c56cbdfe27def9b6cab37d28589c5fe23e1c979f5d5b91a9c5d5bd" Mar 18 14:16:49 crc kubenswrapper[4756]: E0318 
14:16:49.772739 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8a9e40ed6c56cbdfe27def9b6cab37d28589c5fe23e1c979f5d5b91a9c5d5bd\": container with ID starting with e8a9e40ed6c56cbdfe27def9b6cab37d28589c5fe23e1c979f5d5b91a9c5d5bd not found: ID does not exist" containerID="e8a9e40ed6c56cbdfe27def9b6cab37d28589c5fe23e1c979f5d5b91a9c5d5bd" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.772784 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a9e40ed6c56cbdfe27def9b6cab37d28589c5fe23e1c979f5d5b91a9c5d5bd"} err="failed to get container status \"e8a9e40ed6c56cbdfe27def9b6cab37d28589c5fe23e1c979f5d5b91a9c5d5bd\": rpc error: code = NotFound desc = could not find container \"e8a9e40ed6c56cbdfe27def9b6cab37d28589c5fe23e1c979f5d5b91a9c5d5bd\": container with ID starting with e8a9e40ed6c56cbdfe27def9b6cab37d28589c5fe23e1c979f5d5b91a9c5d5bd not found: ID does not exist" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.773228 4756 scope.go:117] "RemoveContainer" containerID="160cd778b4aba73e00e9c9d19cafdebad39662b1aecccf7da5c6020cc789ac5d" Mar 18 14:16:49 crc kubenswrapper[4756]: E0318 14:16:49.773581 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"160cd778b4aba73e00e9c9d19cafdebad39662b1aecccf7da5c6020cc789ac5d\": container with ID starting with 160cd778b4aba73e00e9c9d19cafdebad39662b1aecccf7da5c6020cc789ac5d not found: ID does not exist" containerID="160cd778b4aba73e00e9c9d19cafdebad39662b1aecccf7da5c6020cc789ac5d" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.773613 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"160cd778b4aba73e00e9c9d19cafdebad39662b1aecccf7da5c6020cc789ac5d"} err="failed to get container status \"160cd778b4aba73e00e9c9d19cafdebad39662b1aecccf7da5c6020cc789ac5d\": rpc 
error: code = NotFound desc = could not find container \"160cd778b4aba73e00e9c9d19cafdebad39662b1aecccf7da5c6020cc789ac5d\": container with ID starting with 160cd778b4aba73e00e9c9d19cafdebad39662b1aecccf7da5c6020cc789ac5d not found: ID does not exist" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.773633 4756 scope.go:117] "RemoveContainer" containerID="0a89082c06f0a183616d701fcd3acdb0ca28ba1e8fe31c1e81e4da8ba95e9e4e" Mar 18 14:16:49 crc kubenswrapper[4756]: E0318 14:16:49.773978 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a89082c06f0a183616d701fcd3acdb0ca28ba1e8fe31c1e81e4da8ba95e9e4e\": container with ID starting with 0a89082c06f0a183616d701fcd3acdb0ca28ba1e8fe31c1e81e4da8ba95e9e4e not found: ID does not exist" containerID="0a89082c06f0a183616d701fcd3acdb0ca28ba1e8fe31c1e81e4da8ba95e9e4e" Mar 18 14:16:49 crc kubenswrapper[4756]: I0318 14:16:49.774009 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a89082c06f0a183616d701fcd3acdb0ca28ba1e8fe31c1e81e4da8ba95e9e4e"} err="failed to get container status \"0a89082c06f0a183616d701fcd3acdb0ca28ba1e8fe31c1e81e4da8ba95e9e4e\": rpc error: code = NotFound desc = could not find container \"0a89082c06f0a183616d701fcd3acdb0ca28ba1e8fe31c1e81e4da8ba95e9e4e\": container with ID starting with 0a89082c06f0a183616d701fcd3acdb0ca28ba1e8fe31c1e81e4da8ba95e9e4e not found: ID does not exist" Mar 18 14:16:51 crc kubenswrapper[4756]: I0318 14:16:51.327941 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c0c6907-72fe-449c-9f65-9aa100b67321" path="/var/lib/kubelet/pods/2c0c6907-72fe-449c-9f65-9aa100b67321/volumes" Mar 18 14:16:56 crc kubenswrapper[4756]: I0318 14:16:56.889927 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lfxgb"] Mar 18 14:16:56 crc kubenswrapper[4756]: E0318 14:16:56.891161 4756 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2c0c6907-72fe-449c-9f65-9aa100b67321" containerName="extract-utilities" Mar 18 14:16:56 crc kubenswrapper[4756]: I0318 14:16:56.891189 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6907-72fe-449c-9f65-9aa100b67321" containerName="extract-utilities" Mar 18 14:16:56 crc kubenswrapper[4756]: E0318 14:16:56.891217 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c6907-72fe-449c-9f65-9aa100b67321" containerName="extract-content" Mar 18 14:16:56 crc kubenswrapper[4756]: I0318 14:16:56.891230 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6907-72fe-449c-9f65-9aa100b67321" containerName="extract-content" Mar 18 14:16:56 crc kubenswrapper[4756]: E0318 14:16:56.891255 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0c6907-72fe-449c-9f65-9aa100b67321" containerName="registry-server" Mar 18 14:16:56 crc kubenswrapper[4756]: I0318 14:16:56.891270 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0c6907-72fe-449c-9f65-9aa100b67321" containerName="registry-server" Mar 18 14:16:56 crc kubenswrapper[4756]: I0318 14:16:56.891456 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0c6907-72fe-449c-9f65-9aa100b67321" containerName="registry-server" Mar 18 14:16:56 crc kubenswrapper[4756]: I0318 14:16:56.893330 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:16:56 crc kubenswrapper[4756]: I0318 14:16:56.904977 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lfxgb"] Mar 18 14:16:56 crc kubenswrapper[4756]: I0318 14:16:56.946930 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbx8q\" (UniqueName: \"kubernetes.io/projected/46664941-8110-49b6-98ae-039dddc28118-kube-api-access-jbx8q\") pod \"community-operators-lfxgb\" (UID: \"46664941-8110-49b6-98ae-039dddc28118\") " pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:16:56 crc kubenswrapper[4756]: I0318 14:16:56.947073 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46664941-8110-49b6-98ae-039dddc28118-utilities\") pod \"community-operators-lfxgb\" (UID: \"46664941-8110-49b6-98ae-039dddc28118\") " pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:16:56 crc kubenswrapper[4756]: I0318 14:16:56.947164 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46664941-8110-49b6-98ae-039dddc28118-catalog-content\") pod \"community-operators-lfxgb\" (UID: \"46664941-8110-49b6-98ae-039dddc28118\") " pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:16:57 crc kubenswrapper[4756]: I0318 14:16:57.048237 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbx8q\" (UniqueName: \"kubernetes.io/projected/46664941-8110-49b6-98ae-039dddc28118-kube-api-access-jbx8q\") pod \"community-operators-lfxgb\" (UID: \"46664941-8110-49b6-98ae-039dddc28118\") " pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:16:57 crc kubenswrapper[4756]: I0318 14:16:57.048387 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46664941-8110-49b6-98ae-039dddc28118-utilities\") pod \"community-operators-lfxgb\" (UID: \"46664941-8110-49b6-98ae-039dddc28118\") " pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:16:57 crc kubenswrapper[4756]: I0318 14:16:57.048446 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46664941-8110-49b6-98ae-039dddc28118-catalog-content\") pod \"community-operators-lfxgb\" (UID: \"46664941-8110-49b6-98ae-039dddc28118\") " pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:16:57 crc kubenswrapper[4756]: I0318 14:16:57.048856 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46664941-8110-49b6-98ae-039dddc28118-utilities\") pod \"community-operators-lfxgb\" (UID: \"46664941-8110-49b6-98ae-039dddc28118\") " pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:16:57 crc kubenswrapper[4756]: I0318 14:16:57.049190 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46664941-8110-49b6-98ae-039dddc28118-catalog-content\") pod \"community-operators-lfxgb\" (UID: \"46664941-8110-49b6-98ae-039dddc28118\") " pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:16:57 crc kubenswrapper[4756]: I0318 14:16:57.085142 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbx8q\" (UniqueName: \"kubernetes.io/projected/46664941-8110-49b6-98ae-039dddc28118-kube-api-access-jbx8q\") pod \"community-operators-lfxgb\" (UID: \"46664941-8110-49b6-98ae-039dddc28118\") " pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:16:57 crc kubenswrapper[4756]: I0318 14:16:57.227672 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:16:57 crc kubenswrapper[4756]: I0318 14:16:57.709987 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lfxgb"] Mar 18 14:16:57 crc kubenswrapper[4756]: W0318 14:16:57.721721 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46664941_8110_49b6_98ae_039dddc28118.slice/crio-7cfb6402d3cfc7139e3aca6b9bf3453ea66fbd9c0dcba992430454752f1abd21 WatchSource:0}: Error finding container 7cfb6402d3cfc7139e3aca6b9bf3453ea66fbd9c0dcba992430454752f1abd21: Status 404 returned error can't find the container with id 7cfb6402d3cfc7139e3aca6b9bf3453ea66fbd9c0dcba992430454752f1abd21 Mar 18 14:16:57 crc kubenswrapper[4756]: I0318 14:16:57.750371 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfxgb" event={"ID":"46664941-8110-49b6-98ae-039dddc28118","Type":"ContainerStarted","Data":"7cfb6402d3cfc7139e3aca6b9bf3453ea66fbd9c0dcba992430454752f1abd21"} Mar 18 14:16:58 crc kubenswrapper[4756]: I0318 14:16:58.759485 4756 generic.go:334] "Generic (PLEG): container finished" podID="46664941-8110-49b6-98ae-039dddc28118" containerID="05c6fb9bae89682621cbbb5da88cac0bf9925f07922be76d5dd1983a14683d7d" exitCode=0 Mar 18 14:16:58 crc kubenswrapper[4756]: I0318 14:16:58.759564 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfxgb" event={"ID":"46664941-8110-49b6-98ae-039dddc28118","Type":"ContainerDied","Data":"05c6fb9bae89682621cbbb5da88cac0bf9925f07922be76d5dd1983a14683d7d"} Mar 18 14:16:59 crc kubenswrapper[4756]: I0318 14:16:59.774918 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfxgb" 
event={"ID":"46664941-8110-49b6-98ae-039dddc28118","Type":"ContainerStarted","Data":"4f0470292ee752105fb0ac4205a45d0db7a5fc5829caabdf2db0a44cc13a4178"} Mar 18 14:17:00 crc kubenswrapper[4756]: I0318 14:17:00.782110 4756 generic.go:334] "Generic (PLEG): container finished" podID="46664941-8110-49b6-98ae-039dddc28118" containerID="4f0470292ee752105fb0ac4205a45d0db7a5fc5829caabdf2db0a44cc13a4178" exitCode=0 Mar 18 14:17:00 crc kubenswrapper[4756]: I0318 14:17:00.782462 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfxgb" event={"ID":"46664941-8110-49b6-98ae-039dddc28118","Type":"ContainerDied","Data":"4f0470292ee752105fb0ac4205a45d0db7a5fc5829caabdf2db0a44cc13a4178"} Mar 18 14:17:01 crc kubenswrapper[4756]: I0318 14:17:01.790199 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfxgb" event={"ID":"46664941-8110-49b6-98ae-039dddc28118","Type":"ContainerStarted","Data":"9703e0dda2564bf6b988d9f5afa9bbefe81688d7a551862b9b65ae2c3410a927"} Mar 18 14:17:01 crc kubenswrapper[4756]: I0318 14:17:01.805466 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lfxgb" podStartSLOduration=3.359892089 podStartE2EDuration="5.805441854s" podCreationTimestamp="2026-03-18 14:16:56 +0000 UTC" firstStartedPulling="2026-03-18 14:16:58.762093596 +0000 UTC m=+1020.076511591" lastFinishedPulling="2026-03-18 14:17:01.207643381 +0000 UTC m=+1022.522061356" observedRunningTime="2026-03-18 14:17:01.804438557 +0000 UTC m=+1023.118856552" watchObservedRunningTime="2026-03-18 14:17:01.805441854 +0000 UTC m=+1023.119859839" Mar 18 14:17:07 crc kubenswrapper[4756]: I0318 14:17:07.228800 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:17:07 crc kubenswrapper[4756]: I0318 14:17:07.230271 4756 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:17:07 crc kubenswrapper[4756]: I0318 14:17:07.270609 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:17:07 crc kubenswrapper[4756]: I0318 14:17:07.915355 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:17:08 crc kubenswrapper[4756]: I0318 14:17:08.887249 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-64f4bf856c-2qfht" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.619398 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-fpdph"] Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.622423 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.628361 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.628374 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.628557 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-4vpz9" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.630584 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl"] Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.631344 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.635659 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.647559 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl"] Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.721035 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-5knbh"] Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.722012 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5knbh" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.723921 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.723935 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.724350 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.724734 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-mxk9n" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.741928 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-zwr45"] Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.743010 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-zwr45" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.745821 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.762515 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-zwr45"] Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.816962 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-reloader\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.817013 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-frr-startup\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.817040 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-frr-sockets\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.817065 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4663e088-ab8a-4f74-b7e9-f4d258772346-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jpbxl\" (UID: \"4663e088-ab8a-4f74-b7e9-f4d258772346\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 
14:17:09.817145 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-frr-conf\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.817177 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd7zf\" (UniqueName: \"kubernetes.io/projected/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-kube-api-access-pd7zf\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.817220 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-metrics-certs\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.817251 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-metrics\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.817282 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69b86\" (UniqueName: \"kubernetes.io/projected/4663e088-ab8a-4f74-b7e9-f4d258772346-kube-api-access-69b86\") pod \"frr-k8s-webhook-server-bcc4b6f68-jpbxl\" (UID: \"4663e088-ab8a-4f74-b7e9-f4d258772346\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918252 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f7b4594e-5e27-402a-ab35-ff01fd5392eb-metallb-excludel2\") pod \"speaker-5knbh\" (UID: \"f7b4594e-5e27-402a-ab35-ff01fd5392eb\") " pod="metallb-system/speaker-5knbh" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918299 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-metrics-certs\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918327 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-metrics\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918351 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f7b4594e-5e27-402a-ab35-ff01fd5392eb-memberlist\") pod \"speaker-5knbh\" (UID: \"f7b4594e-5e27-402a-ab35-ff01fd5392eb\") " pod="metallb-system/speaker-5knbh" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918376 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69b86\" (UniqueName: \"kubernetes.io/projected/4663e088-ab8a-4f74-b7e9-f4d258772346-kube-api-access-69b86\") pod \"frr-k8s-webhook-server-bcc4b6f68-jpbxl\" (UID: \"4663e088-ab8a-4f74-b7e9-f4d258772346\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918428 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/660a6ced-cc4a-4447-91ca-aa0b46e7c6ef-metrics-certs\") pod \"controller-7bb4cc7c98-zwr45\" (UID: \"660a6ced-cc4a-4447-91ca-aa0b46e7c6ef\") " pod="metallb-system/controller-7bb4cc7c98-zwr45" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918448 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmr6p\" (UniqueName: \"kubernetes.io/projected/660a6ced-cc4a-4447-91ca-aa0b46e7c6ef-kube-api-access-tmr6p\") pod \"controller-7bb4cc7c98-zwr45\" (UID: \"660a6ced-cc4a-4447-91ca-aa0b46e7c6ef\") " pod="metallb-system/controller-7bb4cc7c98-zwr45" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918465 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-reloader\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918484 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-frr-startup\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918503 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-frr-sockets\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918520 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4663e088-ab8a-4f74-b7e9-f4d258772346-cert\") pod 
\"frr-k8s-webhook-server-bcc4b6f68-jpbxl\" (UID: \"4663e088-ab8a-4f74-b7e9-f4d258772346\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918539 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/660a6ced-cc4a-4447-91ca-aa0b46e7c6ef-cert\") pod \"controller-7bb4cc7c98-zwr45\" (UID: \"660a6ced-cc4a-4447-91ca-aa0b46e7c6ef\") " pod="metallb-system/controller-7bb4cc7c98-zwr45" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918554 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7b4594e-5e27-402a-ab35-ff01fd5392eb-metrics-certs\") pod \"speaker-5knbh\" (UID: \"f7b4594e-5e27-402a-ab35-ff01fd5392eb\") " pod="metallb-system/speaker-5knbh" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918573 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-frr-conf\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918589 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd7zf\" (UniqueName: \"kubernetes.io/projected/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-kube-api-access-pd7zf\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918608 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5fqh\" (UniqueName: \"kubernetes.io/projected/f7b4594e-5e27-402a-ab35-ff01fd5392eb-kube-api-access-q5fqh\") pod \"speaker-5knbh\" (UID: 
\"f7b4594e-5e27-402a-ab35-ff01fd5392eb\") " pod="metallb-system/speaker-5knbh" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.918806 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-metrics\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: E0318 14:17:09.918944 4756 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 18 14:17:09 crc kubenswrapper[4756]: E0318 14:17:09.918987 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4663e088-ab8a-4f74-b7e9-f4d258772346-cert podName:4663e088-ab8a-4f74-b7e9-f4d258772346 nodeName:}" failed. No retries permitted until 2026-03-18 14:17:10.418972511 +0000 UTC m=+1031.733390486 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4663e088-ab8a-4f74-b7e9-f4d258772346-cert") pod "frr-k8s-webhook-server-bcc4b6f68-jpbxl" (UID: "4663e088-ab8a-4f74-b7e9-f4d258772346") : secret "frr-k8s-webhook-server-cert" not found Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.919052 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-frr-sockets\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.919298 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-reloader\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 
14:17:09.919317 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-frr-conf\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.919891 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-frr-startup\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.936716 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-metrics-certs\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.940589 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd7zf\" (UniqueName: \"kubernetes.io/projected/abdfd85a-457d-4a04-bebb-4aae02b3a1ce-kube-api-access-pd7zf\") pod \"frr-k8s-fpdph\" (UID: \"abdfd85a-457d-4a04-bebb-4aae02b3a1ce\") " pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.940911 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69b86\" (UniqueName: \"kubernetes.io/projected/4663e088-ab8a-4f74-b7e9-f4d258772346-kube-api-access-69b86\") pod \"frr-k8s-webhook-server-bcc4b6f68-jpbxl\" (UID: \"4663e088-ab8a-4f74-b7e9-f4d258772346\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl" Mar 18 14:17:09 crc kubenswrapper[4756]: I0318 14:17:09.947812 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.019728 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f7b4594e-5e27-402a-ab35-ff01fd5392eb-memberlist\") pod \"speaker-5knbh\" (UID: \"f7b4594e-5e27-402a-ab35-ff01fd5392eb\") " pod="metallb-system/speaker-5knbh" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.019856 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/660a6ced-cc4a-4447-91ca-aa0b46e7c6ef-metrics-certs\") pod \"controller-7bb4cc7c98-zwr45\" (UID: \"660a6ced-cc4a-4447-91ca-aa0b46e7c6ef\") " pod="metallb-system/controller-7bb4cc7c98-zwr45" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.019888 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmr6p\" (UniqueName: \"kubernetes.io/projected/660a6ced-cc4a-4447-91ca-aa0b46e7c6ef-kube-api-access-tmr6p\") pod \"controller-7bb4cc7c98-zwr45\" (UID: \"660a6ced-cc4a-4447-91ca-aa0b46e7c6ef\") " pod="metallb-system/controller-7bb4cc7c98-zwr45" Mar 18 14:17:10 crc kubenswrapper[4756]: E0318 14:17:10.019942 4756 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 14:17:10 crc kubenswrapper[4756]: E0318 14:17:10.020016 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7b4594e-5e27-402a-ab35-ff01fd5392eb-memberlist podName:f7b4594e-5e27-402a-ab35-ff01fd5392eb nodeName:}" failed. No retries permitted until 2026-03-18 14:17:10.519996527 +0000 UTC m=+1031.834414512 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f7b4594e-5e27-402a-ab35-ff01fd5392eb-memberlist") pod "speaker-5knbh" (UID: "f7b4594e-5e27-402a-ab35-ff01fd5392eb") : secret "metallb-memberlist" not found Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.019951 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/660a6ced-cc4a-4447-91ca-aa0b46e7c6ef-cert\") pod \"controller-7bb4cc7c98-zwr45\" (UID: \"660a6ced-cc4a-4447-91ca-aa0b46e7c6ef\") " pod="metallb-system/controller-7bb4cc7c98-zwr45" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.020082 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7b4594e-5e27-402a-ab35-ff01fd5392eb-metrics-certs\") pod \"speaker-5knbh\" (UID: \"f7b4594e-5e27-402a-ab35-ff01fd5392eb\") " pod="metallb-system/speaker-5knbh" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.020124 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5fqh\" (UniqueName: \"kubernetes.io/projected/f7b4594e-5e27-402a-ab35-ff01fd5392eb-kube-api-access-q5fqh\") pod \"speaker-5knbh\" (UID: \"f7b4594e-5e27-402a-ab35-ff01fd5392eb\") " pod="metallb-system/speaker-5knbh" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.020146 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f7b4594e-5e27-402a-ab35-ff01fd5392eb-metallb-excludel2\") pod \"speaker-5knbh\" (UID: \"f7b4594e-5e27-402a-ab35-ff01fd5392eb\") " pod="metallb-system/speaker-5knbh" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.021682 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f7b4594e-5e27-402a-ab35-ff01fd5392eb-metallb-excludel2\") pod \"speaker-5knbh\" 
(UID: \"f7b4594e-5e27-402a-ab35-ff01fd5392eb\") " pod="metallb-system/speaker-5knbh" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.022261 4756 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.032704 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/660a6ced-cc4a-4447-91ca-aa0b46e7c6ef-metrics-certs\") pod \"controller-7bb4cc7c98-zwr45\" (UID: \"660a6ced-cc4a-4447-91ca-aa0b46e7c6ef\") " pod="metallb-system/controller-7bb4cc7c98-zwr45" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.032908 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7b4594e-5e27-402a-ab35-ff01fd5392eb-metrics-certs\") pod \"speaker-5knbh\" (UID: \"f7b4594e-5e27-402a-ab35-ff01fd5392eb\") " pod="metallb-system/speaker-5knbh" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.034948 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/660a6ced-cc4a-4447-91ca-aa0b46e7c6ef-cert\") pod \"controller-7bb4cc7c98-zwr45\" (UID: \"660a6ced-cc4a-4447-91ca-aa0b46e7c6ef\") " pod="metallb-system/controller-7bb4cc7c98-zwr45" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.036267 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5fqh\" (UniqueName: \"kubernetes.io/projected/f7b4594e-5e27-402a-ab35-ff01fd5392eb-kube-api-access-q5fqh\") pod \"speaker-5knbh\" (UID: \"f7b4594e-5e27-402a-ab35-ff01fd5392eb\") " pod="metallb-system/speaker-5knbh" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.042660 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmr6p\" (UniqueName: \"kubernetes.io/projected/660a6ced-cc4a-4447-91ca-aa0b46e7c6ef-kube-api-access-tmr6p\") pod 
\"controller-7bb4cc7c98-zwr45\" (UID: \"660a6ced-cc4a-4447-91ca-aa0b46e7c6ef\") " pod="metallb-system/controller-7bb4cc7c98-zwr45" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.056894 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-zwr45" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.425705 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4663e088-ab8a-4f74-b7e9-f4d258772346-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jpbxl\" (UID: \"4663e088-ab8a-4f74-b7e9-f4d258772346\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.430559 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4663e088-ab8a-4f74-b7e9-f4d258772346-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jpbxl\" (UID: \"4663e088-ab8a-4f74-b7e9-f4d258772346\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.468335 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-zwr45"] Mar 18 14:17:10 crc kubenswrapper[4756]: W0318 14:17:10.474640 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod660a6ced_cc4a_4447_91ca_aa0b46e7c6ef.slice/crio-58b95afda7f266f4699a519d146480c395b5b8e893031a88d4d0967bc4c62da6 WatchSource:0}: Error finding container 58b95afda7f266f4699a519d146480c395b5b8e893031a88d4d0967bc4c62da6: Status 404 returned error can't find the container with id 58b95afda7f266f4699a519d146480c395b5b8e893031a88d4d0967bc4c62da6 Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.528320 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/f7b4594e-5e27-402a-ab35-ff01fd5392eb-memberlist\") pod \"speaker-5knbh\" (UID: \"f7b4594e-5e27-402a-ab35-ff01fd5392eb\") " pod="metallb-system/speaker-5knbh" Mar 18 14:17:10 crc kubenswrapper[4756]: E0318 14:17:10.528646 4756 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 14:17:10 crc kubenswrapper[4756]: E0318 14:17:10.528774 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7b4594e-5e27-402a-ab35-ff01fd5392eb-memberlist podName:f7b4594e-5e27-402a-ab35-ff01fd5392eb nodeName:}" failed. No retries permitted until 2026-03-18 14:17:11.528745726 +0000 UTC m=+1032.843163741 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f7b4594e-5e27-402a-ab35-ff01fd5392eb-memberlist") pod "speaker-5knbh" (UID: "f7b4594e-5e27-402a-ab35-ff01fd5392eb") : secret "metallb-memberlist" not found Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.560482 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.667463 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lfxgb"] Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.668411 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lfxgb" podUID="46664941-8110-49b6-98ae-039dddc28118" containerName="registry-server" containerID="cri-o://9703e0dda2564bf6b988d9f5afa9bbefe81688d7a551862b9b65ae2c3410a927" gracePeriod=2 Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.798234 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl"] Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.858192 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl" event={"ID":"4663e088-ab8a-4f74-b7e9-f4d258772346","Type":"ContainerStarted","Data":"d20ad958bf49675682c28ccae516b773e7d10e8958938f176a4ad70b9cbd8179"} Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.859845 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-zwr45" event={"ID":"660a6ced-cc4a-4447-91ca-aa0b46e7c6ef","Type":"ContainerStarted","Data":"8339433a3857467dcb823ba572cbfae3ca06eb52522aaeb06d3ace055a07c03d"} Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.859888 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-zwr45" event={"ID":"660a6ced-cc4a-4447-91ca-aa0b46e7c6ef","Type":"ContainerStarted","Data":"0d328ecfb3fc8e745fa7b0c6f762a8d3590370123ddbc409ccc8c5fbf2f419e3"} Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.859901 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-zwr45" 
event={"ID":"660a6ced-cc4a-4447-91ca-aa0b46e7c6ef","Type":"ContainerStarted","Data":"58b95afda7f266f4699a519d146480c395b5b8e893031a88d4d0967bc4c62da6"} Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.859936 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-zwr45" Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.861557 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fpdph" event={"ID":"abdfd85a-457d-4a04-bebb-4aae02b3a1ce","Type":"ContainerStarted","Data":"87ce8e67574625e66a291f2ae0bba26fa0b16487f92943c8bc73b2e82e79421d"} Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.864804 4756 generic.go:334] "Generic (PLEG): container finished" podID="46664941-8110-49b6-98ae-039dddc28118" containerID="9703e0dda2564bf6b988d9f5afa9bbefe81688d7a551862b9b65ae2c3410a927" exitCode=0 Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.864843 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfxgb" event={"ID":"46664941-8110-49b6-98ae-039dddc28118","Type":"ContainerDied","Data":"9703e0dda2564bf6b988d9f5afa9bbefe81688d7a551862b9b65ae2c3410a927"} Mar 18 14:17:10 crc kubenswrapper[4756]: I0318 14:17:10.887773 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-zwr45" podStartSLOduration=1.8877499150000001 podStartE2EDuration="1.887749915s" podCreationTimestamp="2026-03-18 14:17:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:17:10.880367101 +0000 UTC m=+1032.194785086" watchObservedRunningTime="2026-03-18 14:17:10.887749915 +0000 UTC m=+1032.202167890" Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.030037 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.037668 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbx8q\" (UniqueName: \"kubernetes.io/projected/46664941-8110-49b6-98ae-039dddc28118-kube-api-access-jbx8q\") pod \"46664941-8110-49b6-98ae-039dddc28118\" (UID: \"46664941-8110-49b6-98ae-039dddc28118\") " Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.037741 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46664941-8110-49b6-98ae-039dddc28118-catalog-content\") pod \"46664941-8110-49b6-98ae-039dddc28118\" (UID: \"46664941-8110-49b6-98ae-039dddc28118\") " Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.037876 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46664941-8110-49b6-98ae-039dddc28118-utilities\") pod \"46664941-8110-49b6-98ae-039dddc28118\" (UID: \"46664941-8110-49b6-98ae-039dddc28118\") " Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.038802 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46664941-8110-49b6-98ae-039dddc28118-utilities" (OuterVolumeSpecName: "utilities") pod "46664941-8110-49b6-98ae-039dddc28118" (UID: "46664941-8110-49b6-98ae-039dddc28118"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.043566 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46664941-8110-49b6-98ae-039dddc28118-kube-api-access-jbx8q" (OuterVolumeSpecName: "kube-api-access-jbx8q") pod "46664941-8110-49b6-98ae-039dddc28118" (UID: "46664941-8110-49b6-98ae-039dddc28118"). InnerVolumeSpecName "kube-api-access-jbx8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.086571 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46664941-8110-49b6-98ae-039dddc28118-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46664941-8110-49b6-98ae-039dddc28118" (UID: "46664941-8110-49b6-98ae-039dddc28118"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.139898 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46664941-8110-49b6-98ae-039dddc28118-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.139930 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbx8q\" (UniqueName: \"kubernetes.io/projected/46664941-8110-49b6-98ae-039dddc28118-kube-api-access-jbx8q\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.139942 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46664941-8110-49b6-98ae-039dddc28118-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.543173 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f7b4594e-5e27-402a-ab35-ff01fd5392eb-memberlist\") pod \"speaker-5knbh\" (UID: \"f7b4594e-5e27-402a-ab35-ff01fd5392eb\") " pod="metallb-system/speaker-5knbh" Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.547961 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f7b4594e-5e27-402a-ab35-ff01fd5392eb-memberlist\") pod \"speaker-5knbh\" (UID: \"f7b4594e-5e27-402a-ab35-ff01fd5392eb\") " 
pod="metallb-system/speaker-5knbh" Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.835705 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5knbh" Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.878287 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfxgb" event={"ID":"46664941-8110-49b6-98ae-039dddc28118","Type":"ContainerDied","Data":"7cfb6402d3cfc7139e3aca6b9bf3453ea66fbd9c0dcba992430454752f1abd21"} Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.878329 4756 scope.go:117] "RemoveContainer" containerID="9703e0dda2564bf6b988d9f5afa9bbefe81688d7a551862b9b65ae2c3410a927" Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.878338 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfxgb" Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.918639 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lfxgb"] Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.923782 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lfxgb"] Mar 18 14:17:11 crc kubenswrapper[4756]: I0318 14:17:11.955660 4756 scope.go:117] "RemoveContainer" containerID="4f0470292ee752105fb0ac4205a45d0db7a5fc5829caabdf2db0a44cc13a4178" Mar 18 14:17:12 crc kubenswrapper[4756]: I0318 14:17:12.011244 4756 scope.go:117] "RemoveContainer" containerID="05c6fb9bae89682621cbbb5da88cac0bf9925f07922be76d5dd1983a14683d7d" Mar 18 14:17:12 crc kubenswrapper[4756]: I0318 14:17:12.889075 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5knbh" event={"ID":"f7b4594e-5e27-402a-ab35-ff01fd5392eb","Type":"ContainerStarted","Data":"3df172da3e0163469600886893c229c6cb823b9234f47a7e53448f782fd5ec37"} Mar 18 14:17:12 crc kubenswrapper[4756]: I0318 14:17:12.889411 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5knbh" event={"ID":"f7b4594e-5e27-402a-ab35-ff01fd5392eb","Type":"ContainerStarted","Data":"b6ff45a19945224ba8defc25e0d5f30b499b59b04c83bf7e4b46422e070e4033"} Mar 18 14:17:12 crc kubenswrapper[4756]: I0318 14:17:12.889427 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5knbh" event={"ID":"f7b4594e-5e27-402a-ab35-ff01fd5392eb","Type":"ContainerStarted","Data":"c730ef1b23a06e9bf405cc87589a2d5adba9f4c6b79656d697ce4441f9549bf9"} Mar 18 14:17:12 crc kubenswrapper[4756]: I0318 14:17:12.889582 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-5knbh" Mar 18 14:17:12 crc kubenswrapper[4756]: I0318 14:17:12.908832 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-5knbh" podStartSLOduration=3.9088019750000003 podStartE2EDuration="3.908801975s" podCreationTimestamp="2026-03-18 14:17:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:17:12.905827583 +0000 UTC m=+1034.220245558" watchObservedRunningTime="2026-03-18 14:17:12.908801975 +0000 UTC m=+1034.223219950" Mar 18 14:17:13 crc kubenswrapper[4756]: I0318 14:17:13.324669 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46664941-8110-49b6-98ae-039dddc28118" path="/var/lib/kubelet/pods/46664941-8110-49b6-98ae-039dddc28118/volumes" Mar 18 14:17:17 crc kubenswrapper[4756]: I0318 14:17:17.956957 4756 generic.go:334] "Generic (PLEG): container finished" podID="abdfd85a-457d-4a04-bebb-4aae02b3a1ce" containerID="cc2274c68b85abc80b6c1fadc3e68a5401ebd4673a297710eaeb6628cc96308e" exitCode=0 Mar 18 14:17:17 crc kubenswrapper[4756]: I0318 14:17:17.957053 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fpdph" 
event={"ID":"abdfd85a-457d-4a04-bebb-4aae02b3a1ce","Type":"ContainerDied","Data":"cc2274c68b85abc80b6c1fadc3e68a5401ebd4673a297710eaeb6628cc96308e"} Mar 18 14:17:17 crc kubenswrapper[4756]: I0318 14:17:17.961006 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl" event={"ID":"4663e088-ab8a-4f74-b7e9-f4d258772346","Type":"ContainerStarted","Data":"1a16a03ff833a391586876d479274cbdcb1e0db69d10d95ebdd3c1dc331ad80b"} Mar 18 14:17:17 crc kubenswrapper[4756]: I0318 14:17:17.961520 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl" Mar 18 14:17:18 crc kubenswrapper[4756]: I0318 14:17:18.012551 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl" podStartSLOduration=2.663919898 podStartE2EDuration="9.012517347s" podCreationTimestamp="2026-03-18 14:17:09 +0000 UTC" firstStartedPulling="2026-03-18 14:17:10.800012236 +0000 UTC m=+1032.114430221" lastFinishedPulling="2026-03-18 14:17:17.148609675 +0000 UTC m=+1038.463027670" observedRunningTime="2026-03-18 14:17:18.00284247 +0000 UTC m=+1039.317260495" watchObservedRunningTime="2026-03-18 14:17:18.012517347 +0000 UTC m=+1039.326935352" Mar 18 14:17:18 crc kubenswrapper[4756]: I0318 14:17:18.967267 4756 generic.go:334] "Generic (PLEG): container finished" podID="abdfd85a-457d-4a04-bebb-4aae02b3a1ce" containerID="638c7de9cefc0bb2c2ee4e7e862e49cd64e5b1ad8e8081b9b7735a05a078469d" exitCode=0 Mar 18 14:17:18 crc kubenswrapper[4756]: I0318 14:17:18.967338 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fpdph" event={"ID":"abdfd85a-457d-4a04-bebb-4aae02b3a1ce","Type":"ContainerDied","Data":"638c7de9cefc0bb2c2ee4e7e862e49cd64e5b1ad8e8081b9b7735a05a078469d"} Mar 18 14:17:19 crc kubenswrapper[4756]: I0318 14:17:19.978226 4756 generic.go:334] "Generic (PLEG): container finished" 
podID="abdfd85a-457d-4a04-bebb-4aae02b3a1ce" containerID="c3b10b6e832d0b92210c90a7888c50aefb2f038c95726d06777307b74b6d3237" exitCode=0 Mar 18 14:17:19 crc kubenswrapper[4756]: I0318 14:17:19.978280 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fpdph" event={"ID":"abdfd85a-457d-4a04-bebb-4aae02b3a1ce","Type":"ContainerDied","Data":"c3b10b6e832d0b92210c90a7888c50aefb2f038c95726d06777307b74b6d3237"} Mar 18 14:17:20 crc kubenswrapper[4756]: I0318 14:17:20.064809 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-zwr45" Mar 18 14:17:20 crc kubenswrapper[4756]: I0318 14:17:20.994402 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fpdph" event={"ID":"abdfd85a-457d-4a04-bebb-4aae02b3a1ce","Type":"ContainerStarted","Data":"cb7839e306cf999c2eb1c57579db08d70e68b19b53298384d1416d4fc3d9e387"} Mar 18 14:17:20 crc kubenswrapper[4756]: I0318 14:17:20.994447 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fpdph" event={"ID":"abdfd85a-457d-4a04-bebb-4aae02b3a1ce","Type":"ContainerStarted","Data":"baa9a442541e2e54eec622938101b95db54390d85555a6a70bcc52a15aa3ca20"} Mar 18 14:17:20 crc kubenswrapper[4756]: I0318 14:17:20.994460 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fpdph" event={"ID":"abdfd85a-457d-4a04-bebb-4aae02b3a1ce","Type":"ContainerStarted","Data":"3ada704ac554b295dca607fbe5c9a019c44e460d1b52ba91b0f0f51e04aaf2c1"} Mar 18 14:17:20 crc kubenswrapper[4756]: I0318 14:17:20.994471 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fpdph" event={"ID":"abdfd85a-457d-4a04-bebb-4aae02b3a1ce","Type":"ContainerStarted","Data":"30bb0edc8203795e00713edd8c610299a76424f91f28f1da1c304c0ead52d751"} Mar 18 14:17:20 crc kubenswrapper[4756]: I0318 14:17:20.994481 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fpdph" 
event={"ID":"abdfd85a-457d-4a04-bebb-4aae02b3a1ce","Type":"ContainerStarted","Data":"521abffb1188c481075198b4259478fa3bb72940da2561ececd3fa1d32ba25cf"} Mar 18 14:17:20 crc kubenswrapper[4756]: I0318 14:17:20.994492 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fpdph" event={"ID":"abdfd85a-457d-4a04-bebb-4aae02b3a1ce","Type":"ContainerStarted","Data":"d8f6e3381e15b68a60d7ee26ea55342d16c798dfdde0a798f3ccbfe264a8ded3"} Mar 18 14:17:20 crc kubenswrapper[4756]: I0318 14:17:20.994865 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:21 crc kubenswrapper[4756]: I0318 14:17:21.035000 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-fpdph" podStartSLOduration=5.067812634 podStartE2EDuration="12.03498142s" podCreationTimestamp="2026-03-18 14:17:09 +0000 UTC" firstStartedPulling="2026-03-18 14:17:10.165723485 +0000 UTC m=+1031.480141460" lastFinishedPulling="2026-03-18 14:17:17.132892261 +0000 UTC m=+1038.447310246" observedRunningTime="2026-03-18 14:17:21.030474206 +0000 UTC m=+1042.344892211" watchObservedRunningTime="2026-03-18 14:17:21.03498142 +0000 UTC m=+1042.349399405" Mar 18 14:17:24 crc kubenswrapper[4756]: I0318 14:17:24.948080 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:25 crc kubenswrapper[4756]: I0318 14:17:25.042654 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:30 crc kubenswrapper[4756]: I0318 14:17:30.566288 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jpbxl" Mar 18 14:17:31 crc kubenswrapper[4756]: I0318 14:17:31.841063 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-5knbh" Mar 18 14:17:33 crc kubenswrapper[4756]: 
I0318 14:17:33.882145 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rtl6b"] Mar 18 14:17:33 crc kubenswrapper[4756]: E0318 14:17:33.882668 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46664941-8110-49b6-98ae-039dddc28118" containerName="registry-server" Mar 18 14:17:33 crc kubenswrapper[4756]: I0318 14:17:33.882684 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="46664941-8110-49b6-98ae-039dddc28118" containerName="registry-server" Mar 18 14:17:33 crc kubenswrapper[4756]: E0318 14:17:33.882703 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46664941-8110-49b6-98ae-039dddc28118" containerName="extract-content" Mar 18 14:17:33 crc kubenswrapper[4756]: I0318 14:17:33.882712 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="46664941-8110-49b6-98ae-039dddc28118" containerName="extract-content" Mar 18 14:17:33 crc kubenswrapper[4756]: E0318 14:17:33.882733 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46664941-8110-49b6-98ae-039dddc28118" containerName="extract-utilities" Mar 18 14:17:33 crc kubenswrapper[4756]: I0318 14:17:33.882743 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="46664941-8110-49b6-98ae-039dddc28118" containerName="extract-utilities" Mar 18 14:17:33 crc kubenswrapper[4756]: I0318 14:17:33.882887 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="46664941-8110-49b6-98ae-039dddc28118" containerName="registry-server" Mar 18 14:17:33 crc kubenswrapper[4756]: I0318 14:17:33.883922 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:33 crc kubenswrapper[4756]: I0318 14:17:33.910468 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rtl6b"] Mar 18 14:17:33 crc kubenswrapper[4756]: I0318 14:17:33.977502 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w4b9\" (UniqueName: \"kubernetes.io/projected/9fd77897-bf99-4e01-9902-42dbd611eb23-kube-api-access-2w4b9\") pod \"certified-operators-rtl6b\" (UID: \"9fd77897-bf99-4e01-9902-42dbd611eb23\") " pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:33 crc kubenswrapper[4756]: I0318 14:17:33.977572 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fd77897-bf99-4e01-9902-42dbd611eb23-catalog-content\") pod \"certified-operators-rtl6b\" (UID: \"9fd77897-bf99-4e01-9902-42dbd611eb23\") " pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:33 crc kubenswrapper[4756]: I0318 14:17:33.977629 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fd77897-bf99-4e01-9902-42dbd611eb23-utilities\") pod \"certified-operators-rtl6b\" (UID: \"9fd77897-bf99-4e01-9902-42dbd611eb23\") " pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:34 crc kubenswrapper[4756]: I0318 14:17:34.079256 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w4b9\" (UniqueName: \"kubernetes.io/projected/9fd77897-bf99-4e01-9902-42dbd611eb23-kube-api-access-2w4b9\") pod \"certified-operators-rtl6b\" (UID: \"9fd77897-bf99-4e01-9902-42dbd611eb23\") " pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:34 crc kubenswrapper[4756]: I0318 14:17:34.079337 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fd77897-bf99-4e01-9902-42dbd611eb23-catalog-content\") pod \"certified-operators-rtl6b\" (UID: \"9fd77897-bf99-4e01-9902-42dbd611eb23\") " pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:34 crc kubenswrapper[4756]: I0318 14:17:34.079364 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fd77897-bf99-4e01-9902-42dbd611eb23-utilities\") pod \"certified-operators-rtl6b\" (UID: \"9fd77897-bf99-4e01-9902-42dbd611eb23\") " pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:34 crc kubenswrapper[4756]: I0318 14:17:34.079866 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fd77897-bf99-4e01-9902-42dbd611eb23-catalog-content\") pod \"certified-operators-rtl6b\" (UID: \"9fd77897-bf99-4e01-9902-42dbd611eb23\") " pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:34 crc kubenswrapper[4756]: I0318 14:17:34.079947 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fd77897-bf99-4e01-9902-42dbd611eb23-utilities\") pod \"certified-operators-rtl6b\" (UID: \"9fd77897-bf99-4e01-9902-42dbd611eb23\") " pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:34 crc kubenswrapper[4756]: I0318 14:17:34.099109 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w4b9\" (UniqueName: \"kubernetes.io/projected/9fd77897-bf99-4e01-9902-42dbd611eb23-kube-api-access-2w4b9\") pod \"certified-operators-rtl6b\" (UID: \"9fd77897-bf99-4e01-9902-42dbd611eb23\") " pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:34 crc kubenswrapper[4756]: I0318 14:17:34.211328 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:34 crc kubenswrapper[4756]: I0318 14:17:34.500511 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rtl6b"] Mar 18 14:17:34 crc kubenswrapper[4756]: W0318 14:17:34.517951 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fd77897_bf99_4e01_9902_42dbd611eb23.slice/crio-9439d2532a65c94edf04213d44e464844fff482d38efd78932115d2912c2f297 WatchSource:0}: Error finding container 9439d2532a65c94edf04213d44e464844fff482d38efd78932115d2912c2f297: Status 404 returned error can't find the container with id 9439d2532a65c94edf04213d44e464844fff482d38efd78932115d2912c2f297 Mar 18 14:17:35 crc kubenswrapper[4756]: I0318 14:17:35.105061 4756 generic.go:334] "Generic (PLEG): container finished" podID="9fd77897-bf99-4e01-9902-42dbd611eb23" containerID="1773215a9d64ad4f2a741685dae7715bcb6aa24310d0891e5dd1ffa020fd545c" exitCode=0 Mar 18 14:17:35 crc kubenswrapper[4756]: I0318 14:17:35.105178 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rtl6b" event={"ID":"9fd77897-bf99-4e01-9902-42dbd611eb23","Type":"ContainerDied","Data":"1773215a9d64ad4f2a741685dae7715bcb6aa24310d0891e5dd1ffa020fd545c"} Mar 18 14:17:35 crc kubenswrapper[4756]: I0318 14:17:35.105404 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rtl6b" event={"ID":"9fd77897-bf99-4e01-9902-42dbd611eb23","Type":"ContainerStarted","Data":"9439d2532a65c94edf04213d44e464844fff482d38efd78932115d2912c2f297"} Mar 18 14:17:37 crc kubenswrapper[4756]: I0318 14:17:37.132439 4756 generic.go:334] "Generic (PLEG): container finished" podID="9fd77897-bf99-4e01-9902-42dbd611eb23" containerID="b7cb49a267b2e083d0308cc4466a96b91a4bea8b14a675d0755abe618fc0a1c3" exitCode=0 Mar 18 14:17:37 crc kubenswrapper[4756]: I0318 
14:17:37.132538 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rtl6b" event={"ID":"9fd77897-bf99-4e01-9902-42dbd611eb23","Type":"ContainerDied","Data":"b7cb49a267b2e083d0308cc4466a96b91a4bea8b14a675d0755abe618fc0a1c3"} Mar 18 14:17:38 crc kubenswrapper[4756]: I0318 14:17:38.141718 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rtl6b" event={"ID":"9fd77897-bf99-4e01-9902-42dbd611eb23","Type":"ContainerStarted","Data":"684b59eedd17edca9713e0d9a4638f0874d935d2630becd7826d7a9ee63c9f8b"} Mar 18 14:17:38 crc kubenswrapper[4756]: I0318 14:17:38.157762 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rtl6b" podStartSLOduration=2.68908104 podStartE2EDuration="5.157745543s" podCreationTimestamp="2026-03-18 14:17:33 +0000 UTC" firstStartedPulling="2026-03-18 14:17:35.10666681 +0000 UTC m=+1056.421084785" lastFinishedPulling="2026-03-18 14:17:37.575331313 +0000 UTC m=+1058.889749288" observedRunningTime="2026-03-18 14:17:38.157474194 +0000 UTC m=+1059.471892199" watchObservedRunningTime="2026-03-18 14:17:38.157745543 +0000 UTC m=+1059.472163508" Mar 18 14:17:39 crc kubenswrapper[4756]: I0318 14:17:39.080624 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-j7ksc"] Mar 18 14:17:39 crc kubenswrapper[4756]: I0318 14:17:39.081956 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-j7ksc" Mar 18 14:17:39 crc kubenswrapper[4756]: I0318 14:17:39.084801 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 18 14:17:39 crc kubenswrapper[4756]: I0318 14:17:39.086099 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-6c84f" Mar 18 14:17:39 crc kubenswrapper[4756]: I0318 14:17:39.086508 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 18 14:17:39 crc kubenswrapper[4756]: I0318 14:17:39.093529 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-j7ksc"] Mar 18 14:17:39 crc kubenswrapper[4756]: I0318 14:17:39.257152 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvhqv\" (UniqueName: \"kubernetes.io/projected/6436d7cb-143a-4d2a-8fb7-9288ca809679-kube-api-access-qvhqv\") pod \"openstack-operator-index-j7ksc\" (UID: \"6436d7cb-143a-4d2a-8fb7-9288ca809679\") " pod="openstack-operators/openstack-operator-index-j7ksc" Mar 18 14:17:39 crc kubenswrapper[4756]: I0318 14:17:39.357899 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvhqv\" (UniqueName: \"kubernetes.io/projected/6436d7cb-143a-4d2a-8fb7-9288ca809679-kube-api-access-qvhqv\") pod \"openstack-operator-index-j7ksc\" (UID: \"6436d7cb-143a-4d2a-8fb7-9288ca809679\") " pod="openstack-operators/openstack-operator-index-j7ksc" Mar 18 14:17:39 crc kubenswrapper[4756]: I0318 14:17:39.384151 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvhqv\" (UniqueName: \"kubernetes.io/projected/6436d7cb-143a-4d2a-8fb7-9288ca809679-kube-api-access-qvhqv\") pod \"openstack-operator-index-j7ksc\" (UID: 
\"6436d7cb-143a-4d2a-8fb7-9288ca809679\") " pod="openstack-operators/openstack-operator-index-j7ksc" Mar 18 14:17:39 crc kubenswrapper[4756]: I0318 14:17:39.404551 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-j7ksc" Mar 18 14:17:39 crc kubenswrapper[4756]: I0318 14:17:39.831147 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-j7ksc"] Mar 18 14:17:39 crc kubenswrapper[4756]: W0318 14:17:39.838811 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6436d7cb_143a_4d2a_8fb7_9288ca809679.slice/crio-f7255aa6b778e82818b2ab2a63da82a36e303a022761e190cf5c4a64bb3716cb WatchSource:0}: Error finding container f7255aa6b778e82818b2ab2a63da82a36e303a022761e190cf5c4a64bb3716cb: Status 404 returned error can't find the container with id f7255aa6b778e82818b2ab2a63da82a36e303a022761e190cf5c4a64bb3716cb Mar 18 14:17:39 crc kubenswrapper[4756]: I0318 14:17:39.950917 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-fpdph" Mar 18 14:17:40 crc kubenswrapper[4756]: I0318 14:17:40.184933 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j7ksc" event={"ID":"6436d7cb-143a-4d2a-8fb7-9288ca809679","Type":"ContainerStarted","Data":"f7255aa6b778e82818b2ab2a63da82a36e303a022761e190cf5c4a64bb3716cb"} Mar 18 14:17:43 crc kubenswrapper[4756]: I0318 14:17:43.205732 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j7ksc" event={"ID":"6436d7cb-143a-4d2a-8fb7-9288ca809679","Type":"ContainerStarted","Data":"8a352d157ea9cb67e242dd8e7e43d9d043fece94fab7b927a044886ba21dc84f"} Mar 18 14:17:43 crc kubenswrapper[4756]: I0318 14:17:43.222882 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-j7ksc" podStartSLOduration=1.719632496 podStartE2EDuration="4.222865012s" podCreationTimestamp="2026-03-18 14:17:39 +0000 UTC" firstStartedPulling="2026-03-18 14:17:39.841671746 +0000 UTC m=+1061.156089721" lastFinishedPulling="2026-03-18 14:17:42.344904222 +0000 UTC m=+1063.659322237" observedRunningTime="2026-03-18 14:17:43.220200698 +0000 UTC m=+1064.534618693" watchObservedRunningTime="2026-03-18 14:17:43.222865012 +0000 UTC m=+1064.537282987" Mar 18 14:17:44 crc kubenswrapper[4756]: I0318 14:17:44.211472 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:44 crc kubenswrapper[4756]: I0318 14:17:44.212000 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:44 crc kubenswrapper[4756]: I0318 14:17:44.286350 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:45 crc kubenswrapper[4756]: I0318 14:17:45.294531 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:47 crc kubenswrapper[4756]: I0318 14:17:47.867698 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rtl6b"] Mar 18 14:17:48 crc kubenswrapper[4756]: I0318 14:17:48.248681 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rtl6b" podUID="9fd77897-bf99-4e01-9902-42dbd611eb23" containerName="registry-server" containerID="cri-o://684b59eedd17edca9713e0d9a4638f0874d935d2630becd7826d7a9ee63c9f8b" gracePeriod=2 Mar 18 14:17:48 crc kubenswrapper[4756]: I0318 14:17:48.736215 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:48 crc kubenswrapper[4756]: I0318 14:17:48.906867 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fd77897-bf99-4e01-9902-42dbd611eb23-utilities\") pod \"9fd77897-bf99-4e01-9902-42dbd611eb23\" (UID: \"9fd77897-bf99-4e01-9902-42dbd611eb23\") " Mar 18 14:17:48 crc kubenswrapper[4756]: I0318 14:17:48.906950 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fd77897-bf99-4e01-9902-42dbd611eb23-catalog-content\") pod \"9fd77897-bf99-4e01-9902-42dbd611eb23\" (UID: \"9fd77897-bf99-4e01-9902-42dbd611eb23\") " Mar 18 14:17:48 crc kubenswrapper[4756]: I0318 14:17:48.907009 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w4b9\" (UniqueName: \"kubernetes.io/projected/9fd77897-bf99-4e01-9902-42dbd611eb23-kube-api-access-2w4b9\") pod \"9fd77897-bf99-4e01-9902-42dbd611eb23\" (UID: \"9fd77897-bf99-4e01-9902-42dbd611eb23\") " Mar 18 14:17:48 crc kubenswrapper[4756]: I0318 14:17:48.907867 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fd77897-bf99-4e01-9902-42dbd611eb23-utilities" (OuterVolumeSpecName: "utilities") pod "9fd77897-bf99-4e01-9902-42dbd611eb23" (UID: "9fd77897-bf99-4e01-9902-42dbd611eb23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:17:48 crc kubenswrapper[4756]: I0318 14:17:48.913291 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fd77897-bf99-4e01-9902-42dbd611eb23-kube-api-access-2w4b9" (OuterVolumeSpecName: "kube-api-access-2w4b9") pod "9fd77897-bf99-4e01-9902-42dbd611eb23" (UID: "9fd77897-bf99-4e01-9902-42dbd611eb23"). InnerVolumeSpecName "kube-api-access-2w4b9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:17:48 crc kubenswrapper[4756]: I0318 14:17:48.968173 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fd77897-bf99-4e01-9902-42dbd611eb23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fd77897-bf99-4e01-9902-42dbd611eb23" (UID: "9fd77897-bf99-4e01-9902-42dbd611eb23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.008310 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w4b9\" (UniqueName: \"kubernetes.io/projected/9fd77897-bf99-4e01-9902-42dbd611eb23-kube-api-access-2w4b9\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.008338 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fd77897-bf99-4e01-9902-42dbd611eb23-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.008347 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fd77897-bf99-4e01-9902-42dbd611eb23-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.261188 4756 generic.go:334] "Generic (PLEG): container finished" podID="9fd77897-bf99-4e01-9902-42dbd611eb23" containerID="684b59eedd17edca9713e0d9a4638f0874d935d2630becd7826d7a9ee63c9f8b" exitCode=0 Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.261233 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rtl6b" event={"ID":"9fd77897-bf99-4e01-9902-42dbd611eb23","Type":"ContainerDied","Data":"684b59eedd17edca9713e0d9a4638f0874d935d2630becd7826d7a9ee63c9f8b"} Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.261262 4756 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-rtl6b" event={"ID":"9fd77897-bf99-4e01-9902-42dbd611eb23","Type":"ContainerDied","Data":"9439d2532a65c94edf04213d44e464844fff482d38efd78932115d2912c2f297"} Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.261283 4756 scope.go:117] "RemoveContainer" containerID="684b59eedd17edca9713e0d9a4638f0874d935d2630becd7826d7a9ee63c9f8b" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.261411 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rtl6b" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.284101 4756 scope.go:117] "RemoveContainer" containerID="b7cb49a267b2e083d0308cc4466a96b91a4bea8b14a675d0755abe618fc0a1c3" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.299486 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rtl6b"] Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.309750 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rtl6b"] Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.319816 4756 scope.go:117] "RemoveContainer" containerID="1773215a9d64ad4f2a741685dae7715bcb6aa24310d0891e5dd1ffa020fd545c" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.337748 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fd77897-bf99-4e01-9902-42dbd611eb23" path="/var/lib/kubelet/pods/9fd77897-bf99-4e01-9902-42dbd611eb23/volumes" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.344352 4756 scope.go:117] "RemoveContainer" containerID="684b59eedd17edca9713e0d9a4638f0874d935d2630becd7826d7a9ee63c9f8b" Mar 18 14:17:49 crc kubenswrapper[4756]: E0318 14:17:49.344770 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"684b59eedd17edca9713e0d9a4638f0874d935d2630becd7826d7a9ee63c9f8b\": container with ID 
starting with 684b59eedd17edca9713e0d9a4638f0874d935d2630becd7826d7a9ee63c9f8b not found: ID does not exist" containerID="684b59eedd17edca9713e0d9a4638f0874d935d2630becd7826d7a9ee63c9f8b" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.344818 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684b59eedd17edca9713e0d9a4638f0874d935d2630becd7826d7a9ee63c9f8b"} err="failed to get container status \"684b59eedd17edca9713e0d9a4638f0874d935d2630becd7826d7a9ee63c9f8b\": rpc error: code = NotFound desc = could not find container \"684b59eedd17edca9713e0d9a4638f0874d935d2630becd7826d7a9ee63c9f8b\": container with ID starting with 684b59eedd17edca9713e0d9a4638f0874d935d2630becd7826d7a9ee63c9f8b not found: ID does not exist" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.344846 4756 scope.go:117] "RemoveContainer" containerID="b7cb49a267b2e083d0308cc4466a96b91a4bea8b14a675d0755abe618fc0a1c3" Mar 18 14:17:49 crc kubenswrapper[4756]: E0318 14:17:49.345222 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7cb49a267b2e083d0308cc4466a96b91a4bea8b14a675d0755abe618fc0a1c3\": container with ID starting with b7cb49a267b2e083d0308cc4466a96b91a4bea8b14a675d0755abe618fc0a1c3 not found: ID does not exist" containerID="b7cb49a267b2e083d0308cc4466a96b91a4bea8b14a675d0755abe618fc0a1c3" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.345251 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7cb49a267b2e083d0308cc4466a96b91a4bea8b14a675d0755abe618fc0a1c3"} err="failed to get container status \"b7cb49a267b2e083d0308cc4466a96b91a4bea8b14a675d0755abe618fc0a1c3\": rpc error: code = NotFound desc = could not find container \"b7cb49a267b2e083d0308cc4466a96b91a4bea8b14a675d0755abe618fc0a1c3\": container with ID starting with b7cb49a267b2e083d0308cc4466a96b91a4bea8b14a675d0755abe618fc0a1c3 not found: 
ID does not exist" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.345268 4756 scope.go:117] "RemoveContainer" containerID="1773215a9d64ad4f2a741685dae7715bcb6aa24310d0891e5dd1ffa020fd545c" Mar 18 14:17:49 crc kubenswrapper[4756]: E0318 14:17:49.345532 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1773215a9d64ad4f2a741685dae7715bcb6aa24310d0891e5dd1ffa020fd545c\": container with ID starting with 1773215a9d64ad4f2a741685dae7715bcb6aa24310d0891e5dd1ffa020fd545c not found: ID does not exist" containerID="1773215a9d64ad4f2a741685dae7715bcb6aa24310d0891e5dd1ffa020fd545c" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.345560 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1773215a9d64ad4f2a741685dae7715bcb6aa24310d0891e5dd1ffa020fd545c"} err="failed to get container status \"1773215a9d64ad4f2a741685dae7715bcb6aa24310d0891e5dd1ffa020fd545c\": rpc error: code = NotFound desc = could not find container \"1773215a9d64ad4f2a741685dae7715bcb6aa24310d0891e5dd1ffa020fd545c\": container with ID starting with 1773215a9d64ad4f2a741685dae7715bcb6aa24310d0891e5dd1ffa020fd545c not found: ID does not exist" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.405661 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-j7ksc" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.405709 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-j7ksc" Mar 18 14:17:49 crc kubenswrapper[4756]: I0318 14:17:49.441904 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-j7ksc" Mar 18 14:17:50 crc kubenswrapper[4756]: I0318 14:17:50.297798 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-index-j7ksc" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.122981 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4"] Mar 18 14:17:53 crc kubenswrapper[4756]: E0318 14:17:53.123840 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd77897-bf99-4e01-9902-42dbd611eb23" containerName="registry-server" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.123856 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd77897-bf99-4e01-9902-42dbd611eb23" containerName="registry-server" Mar 18 14:17:53 crc kubenswrapper[4756]: E0318 14:17:53.123879 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd77897-bf99-4e01-9902-42dbd611eb23" containerName="extract-content" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.123888 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd77897-bf99-4e01-9902-42dbd611eb23" containerName="extract-content" Mar 18 14:17:53 crc kubenswrapper[4756]: E0318 14:17:53.123901 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd77897-bf99-4e01-9902-42dbd611eb23" containerName="extract-utilities" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.123911 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd77897-bf99-4e01-9902-42dbd611eb23" containerName="extract-utilities" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.124083 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd77897-bf99-4e01-9902-42dbd611eb23" containerName="registry-server" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.125177 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.127523 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-9zh9h" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.143856 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4"] Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.163680 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfbgd\" (UniqueName: \"kubernetes.io/projected/869373d9-b980-4c6a-80aa-3ab7a2e046a2-kube-api-access-lfbgd\") pod \"484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4\" (UID: \"869373d9-b980-4c6a-80aa-3ab7a2e046a2\") " pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.163874 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/869373d9-b980-4c6a-80aa-3ab7a2e046a2-bundle\") pod \"484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4\" (UID: \"869373d9-b980-4c6a-80aa-3ab7a2e046a2\") " pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.163983 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/869373d9-b980-4c6a-80aa-3ab7a2e046a2-util\") pod \"484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4\" (UID: \"869373d9-b980-4c6a-80aa-3ab7a2e046a2\") " pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 
14:17:53.267759 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/869373d9-b980-4c6a-80aa-3ab7a2e046a2-bundle\") pod \"484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4\" (UID: \"869373d9-b980-4c6a-80aa-3ab7a2e046a2\") " pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.267818 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/869373d9-b980-4c6a-80aa-3ab7a2e046a2-util\") pod \"484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4\" (UID: \"869373d9-b980-4c6a-80aa-3ab7a2e046a2\") " pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.267888 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfbgd\" (UniqueName: \"kubernetes.io/projected/869373d9-b980-4c6a-80aa-3ab7a2e046a2-kube-api-access-lfbgd\") pod \"484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4\" (UID: \"869373d9-b980-4c6a-80aa-3ab7a2e046a2\") " pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.268719 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/869373d9-b980-4c6a-80aa-3ab7a2e046a2-bundle\") pod \"484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4\" (UID: \"869373d9-b980-4c6a-80aa-3ab7a2e046a2\") " pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.268939 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/869373d9-b980-4c6a-80aa-3ab7a2e046a2-util\") pod \"484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4\" (UID: \"869373d9-b980-4c6a-80aa-3ab7a2e046a2\") " pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.296467 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfbgd\" (UniqueName: \"kubernetes.io/projected/869373d9-b980-4c6a-80aa-3ab7a2e046a2-kube-api-access-lfbgd\") pod \"484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4\" (UID: \"869373d9-b980-4c6a-80aa-3ab7a2e046a2\") " pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.469887 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" Mar 18 14:17:53 crc kubenswrapper[4756]: I0318 14:17:53.906748 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4"] Mar 18 14:17:54 crc kubenswrapper[4756]: I0318 14:17:54.300616 4756 generic.go:334] "Generic (PLEG): container finished" podID="869373d9-b980-4c6a-80aa-3ab7a2e046a2" containerID="51a88fa2efadd37424191cb6ff9f4f8a86409def20baa35f67dfcb9e0c1f4acd" exitCode=0 Mar 18 14:17:54 crc kubenswrapper[4756]: I0318 14:17:54.300666 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" event={"ID":"869373d9-b980-4c6a-80aa-3ab7a2e046a2","Type":"ContainerDied","Data":"51a88fa2efadd37424191cb6ff9f4f8a86409def20baa35f67dfcb9e0c1f4acd"} Mar 18 14:17:54 crc kubenswrapper[4756]: I0318 14:17:54.300700 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" event={"ID":"869373d9-b980-4c6a-80aa-3ab7a2e046a2","Type":"ContainerStarted","Data":"3bb875ed38209c6a2449a024aed60232969a977de51e72d4c894e90bf7425583"}
Mar 18 14:17:55 crc kubenswrapper[4756]: I0318 14:17:55.309827 4756 generic.go:334] "Generic (PLEG): container finished" podID="869373d9-b980-4c6a-80aa-3ab7a2e046a2" containerID="b541e17d10d9b8172df6e3c876058d8e977954e6e185d8502e1dbe0d8bd0ebfb" exitCode=0
Mar 18 14:17:55 crc kubenswrapper[4756]: I0318 14:17:55.309953 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" event={"ID":"869373d9-b980-4c6a-80aa-3ab7a2e046a2","Type":"ContainerDied","Data":"b541e17d10d9b8172df6e3c876058d8e977954e6e185d8502e1dbe0d8bd0ebfb"}
Mar 18 14:17:56 crc kubenswrapper[4756]: I0318 14:17:56.322070 4756 generic.go:334] "Generic (PLEG): container finished" podID="869373d9-b980-4c6a-80aa-3ab7a2e046a2" containerID="6eedc02d4a526810a727fa279ed6f70026955e12b5914f8fe1d9854edd51a108" exitCode=0
Mar 18 14:17:56 crc kubenswrapper[4756]: I0318 14:17:56.322521 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" event={"ID":"869373d9-b980-4c6a-80aa-3ab7a2e046a2","Type":"ContainerDied","Data":"6eedc02d4a526810a727fa279ed6f70026955e12b5914f8fe1d9854edd51a108"}
Mar 18 14:17:57 crc kubenswrapper[4756]: I0318 14:17:57.630438 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4"
Mar 18 14:17:57 crc kubenswrapper[4756]: I0318 14:17:57.831140 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfbgd\" (UniqueName: \"kubernetes.io/projected/869373d9-b980-4c6a-80aa-3ab7a2e046a2-kube-api-access-lfbgd\") pod \"869373d9-b980-4c6a-80aa-3ab7a2e046a2\" (UID: \"869373d9-b980-4c6a-80aa-3ab7a2e046a2\") "
Mar 18 14:17:57 crc kubenswrapper[4756]: I0318 14:17:57.831405 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/869373d9-b980-4c6a-80aa-3ab7a2e046a2-util\") pod \"869373d9-b980-4c6a-80aa-3ab7a2e046a2\" (UID: \"869373d9-b980-4c6a-80aa-3ab7a2e046a2\") "
Mar 18 14:17:57 crc kubenswrapper[4756]: I0318 14:17:57.831533 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/869373d9-b980-4c6a-80aa-3ab7a2e046a2-bundle\") pod \"869373d9-b980-4c6a-80aa-3ab7a2e046a2\" (UID: \"869373d9-b980-4c6a-80aa-3ab7a2e046a2\") "
Mar 18 14:17:57 crc kubenswrapper[4756]: I0318 14:17:57.832473 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/869373d9-b980-4c6a-80aa-3ab7a2e046a2-bundle" (OuterVolumeSpecName: "bundle") pod "869373d9-b980-4c6a-80aa-3ab7a2e046a2" (UID: "869373d9-b980-4c6a-80aa-3ab7a2e046a2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 14:17:57 crc kubenswrapper[4756]: I0318 14:17:57.833023 4756 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/869373d9-b980-4c6a-80aa-3ab7a2e046a2-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 14:17:57 crc kubenswrapper[4756]: I0318 14:17:57.839670 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869373d9-b980-4c6a-80aa-3ab7a2e046a2-kube-api-access-lfbgd" (OuterVolumeSpecName: "kube-api-access-lfbgd") pod "869373d9-b980-4c6a-80aa-3ab7a2e046a2" (UID: "869373d9-b980-4c6a-80aa-3ab7a2e046a2"). InnerVolumeSpecName "kube-api-access-lfbgd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:17:57 crc kubenswrapper[4756]: I0318 14:17:57.866853 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/869373d9-b980-4c6a-80aa-3ab7a2e046a2-util" (OuterVolumeSpecName: "util") pod "869373d9-b980-4c6a-80aa-3ab7a2e046a2" (UID: "869373d9-b980-4c6a-80aa-3ab7a2e046a2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 14:17:57 crc kubenswrapper[4756]: I0318 14:17:57.934242 4756 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/869373d9-b980-4c6a-80aa-3ab7a2e046a2-util\") on node \"crc\" DevicePath \"\""
Mar 18 14:17:57 crc kubenswrapper[4756]: I0318 14:17:57.934304 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfbgd\" (UniqueName: \"kubernetes.io/projected/869373d9-b980-4c6a-80aa-3ab7a2e046a2-kube-api-access-lfbgd\") on node \"crc\" DevicePath \"\""
Mar 18 14:17:58 crc kubenswrapper[4756]: I0318 14:17:58.357659 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4" event={"ID":"869373d9-b980-4c6a-80aa-3ab7a2e046a2","Type":"ContainerDied","Data":"3bb875ed38209c6a2449a024aed60232969a977de51e72d4c894e90bf7425583"}
Mar 18 14:17:58 crc kubenswrapper[4756]: I0318 14:17:58.357738 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bb875ed38209c6a2449a024aed60232969a977de51e72d4c894e90bf7425583"
Mar 18 14:17:58 crc kubenswrapper[4756]: I0318 14:17:58.357754 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4"
Mar 18 14:18:00 crc kubenswrapper[4756]: I0318 14:18:00.144132 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564058-qrjcw"]
Mar 18 14:18:00 crc kubenswrapper[4756]: E0318 14:18:00.144586 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869373d9-b980-4c6a-80aa-3ab7a2e046a2" containerName="pull"
Mar 18 14:18:00 crc kubenswrapper[4756]: I0318 14:18:00.144597 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="869373d9-b980-4c6a-80aa-3ab7a2e046a2" containerName="pull"
Mar 18 14:18:00 crc kubenswrapper[4756]: E0318 14:18:00.144613 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869373d9-b980-4c6a-80aa-3ab7a2e046a2" containerName="extract"
Mar 18 14:18:00 crc kubenswrapper[4756]: I0318 14:18:00.144619 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="869373d9-b980-4c6a-80aa-3ab7a2e046a2" containerName="extract"
Mar 18 14:18:00 crc kubenswrapper[4756]: E0318 14:18:00.144631 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869373d9-b980-4c6a-80aa-3ab7a2e046a2" containerName="util"
Mar 18 14:18:00 crc kubenswrapper[4756]: I0318 14:18:00.144637 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="869373d9-b980-4c6a-80aa-3ab7a2e046a2" containerName="util"
Mar 18 14:18:00 crc kubenswrapper[4756]: I0318 14:18:00.144762 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="869373d9-b980-4c6a-80aa-3ab7a2e046a2" containerName="extract"
Mar 18 14:18:00 crc kubenswrapper[4756]: I0318 14:18:00.145183 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564058-qrjcw"
Mar 18 14:18:00 crc kubenswrapper[4756]: I0318 14:18:00.148355 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 14:18:00 crc kubenswrapper[4756]: I0318 14:18:00.148580 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx"
Mar 18 14:18:00 crc kubenswrapper[4756]: I0318 14:18:00.150236 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 14:18:00 crc kubenswrapper[4756]: I0318 14:18:00.162070 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564058-qrjcw"]
Mar 18 14:18:00 crc kubenswrapper[4756]: I0318 14:18:00.166965 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4djgc\" (UniqueName: \"kubernetes.io/projected/2a05ddf2-3ab0-4c2b-b697-8609cd540ffe-kube-api-access-4djgc\") pod \"auto-csr-approver-29564058-qrjcw\" (UID: \"2a05ddf2-3ab0-4c2b-b697-8609cd540ffe\") " pod="openshift-infra/auto-csr-approver-29564058-qrjcw"
Mar 18 14:18:00 crc kubenswrapper[4756]: I0318 14:18:00.267829 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4djgc\" (UniqueName: \"kubernetes.io/projected/2a05ddf2-3ab0-4c2b-b697-8609cd540ffe-kube-api-access-4djgc\") pod \"auto-csr-approver-29564058-qrjcw\" (UID: \"2a05ddf2-3ab0-4c2b-b697-8609cd540ffe\") " pod="openshift-infra/auto-csr-approver-29564058-qrjcw"
Mar 18 14:18:00 crc kubenswrapper[4756]: I0318 14:18:00.290510 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4djgc\" (UniqueName: \"kubernetes.io/projected/2a05ddf2-3ab0-4c2b-b697-8609cd540ffe-kube-api-access-4djgc\") pod \"auto-csr-approver-29564058-qrjcw\" (UID: \"2a05ddf2-3ab0-4c2b-b697-8609cd540ffe\") " pod="openshift-infra/auto-csr-approver-29564058-qrjcw"
Mar 18 14:18:00 crc kubenswrapper[4756]: I0318 14:18:00.458207 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564058-qrjcw"
Mar 18 14:18:00 crc kubenswrapper[4756]: I0318 14:18:00.932084 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564058-qrjcw"]
Mar 18 14:18:01 crc kubenswrapper[4756]: I0318 14:18:01.376794 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564058-qrjcw" event={"ID":"2a05ddf2-3ab0-4c2b-b697-8609cd540ffe","Type":"ContainerStarted","Data":"7d1138794b077ac7395cab990eea81c40e967cc54e2f0fd0ede3907d1fb4044d"}
Mar 18 14:18:02 crc kubenswrapper[4756]: I0318 14:18:02.385329 4756 generic.go:334] "Generic (PLEG): container finished" podID="2a05ddf2-3ab0-4c2b-b697-8609cd540ffe" containerID="46affe02979f6a10d58e923caec0ceede3e7aef28c406ac6c18fac7f02e67a78" exitCode=0
Mar 18 14:18:02 crc kubenswrapper[4756]: I0318 14:18:02.385382 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564058-qrjcw" event={"ID":"2a05ddf2-3ab0-4c2b-b697-8609cd540ffe","Type":"ContainerDied","Data":"46affe02979f6a10d58e923caec0ceede3e7aef28c406ac6c18fac7f02e67a78"}
Mar 18 14:18:03 crc kubenswrapper[4756]: I0318 14:18:03.703850 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564058-qrjcw"
Mar 18 14:18:03 crc kubenswrapper[4756]: I0318 14:18:03.818088 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4djgc\" (UniqueName: \"kubernetes.io/projected/2a05ddf2-3ab0-4c2b-b697-8609cd540ffe-kube-api-access-4djgc\") pod \"2a05ddf2-3ab0-4c2b-b697-8609cd540ffe\" (UID: \"2a05ddf2-3ab0-4c2b-b697-8609cd540ffe\") "
Mar 18 14:18:03 crc kubenswrapper[4756]: I0318 14:18:03.824186 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a05ddf2-3ab0-4c2b-b697-8609cd540ffe-kube-api-access-4djgc" (OuterVolumeSpecName: "kube-api-access-4djgc") pod "2a05ddf2-3ab0-4c2b-b697-8609cd540ffe" (UID: "2a05ddf2-3ab0-4c2b-b697-8609cd540ffe"). InnerVolumeSpecName "kube-api-access-4djgc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:18:03 crc kubenswrapper[4756]: I0318 14:18:03.919377 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4djgc\" (UniqueName: \"kubernetes.io/projected/2a05ddf2-3ab0-4c2b-b697-8609cd540ffe-kube-api-access-4djgc\") on node \"crc\" DevicePath \"\""
Mar 18 14:18:04 crc kubenswrapper[4756]: I0318 14:18:04.401645 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564058-qrjcw" event={"ID":"2a05ddf2-3ab0-4c2b-b697-8609cd540ffe","Type":"ContainerDied","Data":"7d1138794b077ac7395cab990eea81c40e967cc54e2f0fd0ede3907d1fb4044d"}
Mar 18 14:18:04 crc kubenswrapper[4756]: I0318 14:18:04.401682 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d1138794b077ac7395cab990eea81c40e967cc54e2f0fd0ede3907d1fb4044d"
Mar 18 14:18:04 crc kubenswrapper[4756]: I0318 14:18:04.402100 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564058-qrjcw"
Mar 18 14:18:04 crc kubenswrapper[4756]: I0318 14:18:04.682388 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5847fcc4fb-jhc4m"]
Mar 18 14:18:04 crc kubenswrapper[4756]: E0318 14:18:04.682688 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a05ddf2-3ab0-4c2b-b697-8609cd540ffe" containerName="oc"
Mar 18 14:18:04 crc kubenswrapper[4756]: I0318 14:18:04.682704 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a05ddf2-3ab0-4c2b-b697-8609cd540ffe" containerName="oc"
Mar 18 14:18:04 crc kubenswrapper[4756]: I0318 14:18:04.682869 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a05ddf2-3ab0-4c2b-b697-8609cd540ffe" containerName="oc"
Mar 18 14:18:04 crc kubenswrapper[4756]: I0318 14:18:04.683461 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5847fcc4fb-jhc4m"
Mar 18 14:18:04 crc kubenswrapper[4756]: I0318 14:18:04.685837 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-zn5hv"
Mar 18 14:18:04 crc kubenswrapper[4756]: I0318 14:18:04.710954 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5847fcc4fb-jhc4m"]
Mar 18 14:18:04 crc kubenswrapper[4756]: I0318 14:18:04.795675 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564052-m29g7"]
Mar 18 14:18:04 crc kubenswrapper[4756]: I0318 14:18:04.802688 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564052-m29g7"]
Mar 18 14:18:04 crc kubenswrapper[4756]: I0318 14:18:04.832230 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jgmv\" (UniqueName: \"kubernetes.io/projected/96217702-f9ab-4f27-bcf1-f2bcd60058c0-kube-api-access-2jgmv\") pod \"openstack-operator-controller-init-5847fcc4fb-jhc4m\" (UID: \"96217702-f9ab-4f27-bcf1-f2bcd60058c0\") " pod="openstack-operators/openstack-operator-controller-init-5847fcc4fb-jhc4m"
Mar 18 14:18:04 crc kubenswrapper[4756]: I0318 14:18:04.933102 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jgmv\" (UniqueName: \"kubernetes.io/projected/96217702-f9ab-4f27-bcf1-f2bcd60058c0-kube-api-access-2jgmv\") pod \"openstack-operator-controller-init-5847fcc4fb-jhc4m\" (UID: \"96217702-f9ab-4f27-bcf1-f2bcd60058c0\") " pod="openstack-operators/openstack-operator-controller-init-5847fcc4fb-jhc4m"
Mar 18 14:18:04 crc kubenswrapper[4756]: I0318 14:18:04.958833 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jgmv\" (UniqueName: \"kubernetes.io/projected/96217702-f9ab-4f27-bcf1-f2bcd60058c0-kube-api-access-2jgmv\") pod \"openstack-operator-controller-init-5847fcc4fb-jhc4m\" (UID: \"96217702-f9ab-4f27-bcf1-f2bcd60058c0\") " pod="openstack-operators/openstack-operator-controller-init-5847fcc4fb-jhc4m"
Mar 18 14:18:05 crc kubenswrapper[4756]: I0318 14:18:05.012740 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5847fcc4fb-jhc4m"
Mar 18 14:18:05 crc kubenswrapper[4756]: I0318 14:18:05.280551 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5847fcc4fb-jhc4m"]
Mar 18 14:18:05 crc kubenswrapper[4756]: W0318 14:18:05.284391 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96217702_f9ab_4f27_bcf1_f2bcd60058c0.slice/crio-13cac94b8ef6dcd8faad1de7d9105fbad1c798216dd0f5ba789e1414c0092835 WatchSource:0}: Error finding container 13cac94b8ef6dcd8faad1de7d9105fbad1c798216dd0f5ba789e1414c0092835: Status 404 returned error can't find the container with id 13cac94b8ef6dcd8faad1de7d9105fbad1c798216dd0f5ba789e1414c0092835
Mar 18 14:18:05 crc kubenswrapper[4756]: I0318 14:18:05.338458 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f37f5eec-136f-4489-9945-c63457566f85" path="/var/lib/kubelet/pods/f37f5eec-136f-4489-9945-c63457566f85/volumes"
Mar 18 14:18:05 crc kubenswrapper[4756]: I0318 14:18:05.408814 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5847fcc4fb-jhc4m" event={"ID":"96217702-f9ab-4f27-bcf1-f2bcd60058c0","Type":"ContainerStarted","Data":"13cac94b8ef6dcd8faad1de7d9105fbad1c798216dd0f5ba789e1414c0092835"}
Mar 18 14:18:06 crc kubenswrapper[4756]: I0318 14:18:06.915630 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 14:18:06 crc kubenswrapper[4756]: I0318 14:18:06.916080 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 14:18:10 crc kubenswrapper[4756]: I0318 14:18:10.452287 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5847fcc4fb-jhc4m" event={"ID":"96217702-f9ab-4f27-bcf1-f2bcd60058c0","Type":"ContainerStarted","Data":"7e777ab379267824d104bf882ce7ebf31a8fb2ac420095312b9b1afd93f5eb99"}
Mar 18 14:18:10 crc kubenswrapper[4756]: I0318 14:18:10.452767 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5847fcc4fb-jhc4m"
Mar 18 14:18:10 crc kubenswrapper[4756]: I0318 14:18:10.498040 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5847fcc4fb-jhc4m" podStartSLOduration=2.462941307 podStartE2EDuration="6.498010573s" podCreationTimestamp="2026-03-18 14:18:04 +0000 UTC" firstStartedPulling="2026-03-18 14:18:05.286252369 +0000 UTC m=+1086.600670334" lastFinishedPulling="2026-03-18 14:18:09.321321615 +0000 UTC m=+1090.635739600" observedRunningTime="2026-03-18 14:18:10.494867378 +0000 UTC m=+1091.809285353" watchObservedRunningTime="2026-03-18 14:18:10.498010573 +0000 UTC m=+1091.812428588"
Mar 18 14:18:15 crc kubenswrapper[4756]: I0318 14:18:15.015862 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5847fcc4fb-jhc4m"
Mar 18 14:18:20 crc kubenswrapper[4756]: I0318 14:18:20.521830 4756 scope.go:117] "RemoveContainer" containerID="0e9092047ae6949d66cc88c7ad464f2d450e4c2d8c7f250e2110836600e2c6c5"
Mar 18 14:18:36 crc kubenswrapper[4756]: I0318 14:18:36.914920 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 14:18:36 crc kubenswrapper[4756]: I0318 14:18:36.915504 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.596433 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-gmmtd"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.597175 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-gmmtd"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.599885 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vx4mk"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.607916 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-mskn7"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.608736 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mskn7"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.610774 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-vssrp"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.611270 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mr7c\" (UniqueName: \"kubernetes.io/projected/a5afd6e2-d647-4165-a9ef-506d7e16173c-kube-api-access-7mr7c\") pod \"barbican-operator-controller-manager-59bc569d95-gmmtd\" (UID: \"a5afd6e2-d647-4165-a9ef-506d7e16173c\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-gmmtd"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.611368 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnrtv\" (UniqueName: \"kubernetes.io/projected/1a62c450-4f9c-4c7b-a864-2600eb6c8589-kube-api-access-rnrtv\") pod \"cinder-operator-controller-manager-8d58dc466-mskn7\" (UID: \"1a62c450-4f9c-4c7b-a864-2600eb6c8589\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mskn7"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.616686 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-gmmtd"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.618824 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-mskn7"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.632938 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-mhqlh"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.634395 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mhqlh"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.637585 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-sm745"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.639210 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-p6hhx"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.640457 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-p6hhx"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.641585 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-f4pb5"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.662342 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-p6hhx"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.697997 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-mhqlh"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.712386 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.712906 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf765\" (UniqueName: \"kubernetes.io/projected/aab73c06-7468-4302-ab88-6c91308ca2ac-kube-api-access-jf765\") pod \"glance-operator-controller-manager-79df6bcc97-p6hhx\" (UID: \"aab73c06-7468-4302-ab88-6c91308ca2ac\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-p6hhx"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.713021 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnrtv\" (UniqueName: \"kubernetes.io/projected/1a62c450-4f9c-4c7b-a864-2600eb6c8589-kube-api-access-rnrtv\") pod \"cinder-operator-controller-manager-8d58dc466-mskn7\" (UID: \"1a62c450-4f9c-4c7b-a864-2600eb6c8589\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mskn7"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.713164 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mr7c\" (UniqueName: \"kubernetes.io/projected/a5afd6e2-d647-4165-a9ef-506d7e16173c-kube-api-access-7mr7c\") pod \"barbican-operator-controller-manager-59bc569d95-gmmtd\" (UID: \"a5afd6e2-d647-4165-a9ef-506d7e16173c\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-gmmtd"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.713254 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2nrq\" (UniqueName: \"kubernetes.io/projected/d05621f7-0f1e-4b58-b016-6cdb083fed42-kube-api-access-k2nrq\") pod \"designate-operator-controller-manager-588d4d986b-mhqlh\" (UID: \"d05621f7-0f1e-4b58-b016-6cdb083fed42\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mhqlh"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.713497 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.719321 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-x524k"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.729233 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-vgbf2"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.730109 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vgbf2"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.733606 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-tcjmz"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.749447 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnrtv\" (UniqueName: \"kubernetes.io/projected/1a62c450-4f9c-4c7b-a864-2600eb6c8589-kube-api-access-rnrtv\") pod \"cinder-operator-controller-manager-8d58dc466-mskn7\" (UID: \"1a62c450-4f9c-4c7b-a864-2600eb6c8589\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mskn7"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.753038 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mr7c\" (UniqueName: \"kubernetes.io/projected/a5afd6e2-d647-4165-a9ef-506d7e16173c-kube-api-access-7mr7c\") pod \"barbican-operator-controller-manager-59bc569d95-gmmtd\" (UID: \"a5afd6e2-d647-4165-a9ef-506d7e16173c\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-gmmtd"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.757577 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.783188 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-vgbf2"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.786409 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.787177 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.788981 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.789059 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-4bfzt"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.795843 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.803516 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-wx92j"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.804395 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wx92j"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.816511 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-g65jn"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.817845 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2nrq\" (UniqueName: \"kubernetes.io/projected/d05621f7-0f1e-4b58-b016-6cdb083fed42-kube-api-access-k2nrq\") pod \"designate-operator-controller-manager-588d4d986b-mhqlh\" (UID: \"d05621f7-0f1e-4b58-b016-6cdb083fed42\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mhqlh"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.817881 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctfmz\" (UniqueName: \"kubernetes.io/projected/400b73ce-6d8f-4392-b47b-fef88e8452bd-kube-api-access-ctfmz\") pod \"ironic-operator-controller-manager-6f787dddc9-wx92j\" (UID: \"400b73ce-6d8f-4392-b47b-fef88e8452bd\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wx92j"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.817903 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stk9l\" (UniqueName: \"kubernetes.io/projected/ba85373b-8f2d-4f13-8ea7-0648b49074da-kube-api-access-stk9l\") pod \"infra-operator-controller-manager-7b9c774f96-9lwv9\" (UID: \"ba85373b-8f2d-4f13-8ea7-0648b49074da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.817922 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jxnv\" (UniqueName: \"kubernetes.io/projected/6e2eb455-f2ff-40c0-9c26-67c675c1102f-kube-api-access-4jxnv\") pod \"heat-operator-controller-manager-67dd5f86f5-k87mt\" (UID: \"6e2eb455-f2ff-40c0-9c26-67c675c1102f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.817968 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf765\" (UniqueName: \"kubernetes.io/projected/aab73c06-7468-4302-ab88-6c91308ca2ac-kube-api-access-jf765\") pod \"glance-operator-controller-manager-79df6bcc97-p6hhx\" (UID: \"aab73c06-7468-4302-ab88-6c91308ca2ac\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-p6hhx"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.818002 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-9lwv9\" (UID: \"ba85373b-8f2d-4f13-8ea7-0648b49074da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.818039 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdwhg\" (UniqueName: \"kubernetes.io/projected/67b596ac-2e44-42b4-99b2-fd1c8712aaed-kube-api-access-rdwhg\") pod \"horizon-operator-controller-manager-8464cc45fb-vgbf2\" (UID: \"67b596ac-2e44-42b4-99b2-fd1c8712aaed\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vgbf2"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.818307 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-br68j"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.819160 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-br68j"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.821962 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-zjklh"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.829644 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-bz77b"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.830465 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bz77b"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.840172 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.841011 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.843757 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-wx92j"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.848730 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-bz77b"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.849338 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-dbml7"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.854797 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.855267 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-dqlx8"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.866535 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-br68j"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.869755 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2nrq\" (UniqueName: \"kubernetes.io/projected/d05621f7-0f1e-4b58-b016-6cdb083fed42-kube-api-access-k2nrq\") pod \"designate-operator-controller-manager-588d4d986b-mhqlh\" (UID: \"d05621f7-0f1e-4b58-b016-6cdb083fed42\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mhqlh"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.891567 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-ggbzf"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.892339 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-fk8pc"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.897653 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-ggbzf"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.897737 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-fk8pc"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.898051 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-ggbzf"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.901571 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-ccjm6"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.901775 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-wbf4l"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.903781 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf765\" (UniqueName: \"kubernetes.io/projected/aab73c06-7468-4302-ab88-6c91308ca2ac-kube-api-access-jf765\") pod \"glance-operator-controller-manager-79df6bcc97-p6hhx\" (UID: \"aab73c06-7468-4302-ab88-6c91308ca2ac\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-p6hhx"
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.903851 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-fk8pc"]
Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.910172 4756 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-fzxth"] Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.911027 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fzxth" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.912633 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-wcvx7" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.917836 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-gmmtd" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.918875 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctfmz\" (UniqueName: \"kubernetes.io/projected/400b73ce-6d8f-4392-b47b-fef88e8452bd-kube-api-access-ctfmz\") pod \"ironic-operator-controller-manager-6f787dddc9-wx92j\" (UID: \"400b73ce-6d8f-4392-b47b-fef88e8452bd\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wx92j" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.919141 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stk9l\" (UniqueName: \"kubernetes.io/projected/ba85373b-8f2d-4f13-8ea7-0648b49074da-kube-api-access-stk9l\") pod \"infra-operator-controller-manager-7b9c774f96-9lwv9\" (UID: \"ba85373b-8f2d-4f13-8ea7-0648b49074da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.919166 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jxnv\" (UniqueName: \"kubernetes.io/projected/6e2eb455-f2ff-40c0-9c26-67c675c1102f-kube-api-access-4jxnv\") pod \"heat-operator-controller-manager-67dd5f86f5-k87mt\" (UID: 
\"6e2eb455-f2ff-40c0-9c26-67c675c1102f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.919187 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-682qx\" (UniqueName: \"kubernetes.io/projected/83526c83-3dbe-42b4-a101-7ce37495b4cf-kube-api-access-682qx\") pod \"neutron-operator-controller-manager-767865f676-fk8pc\" (UID: \"83526c83-3dbe-42b4-a101-7ce37495b4cf\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-fk8pc" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.919210 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tb8g\" (UniqueName: \"kubernetes.io/projected/dfcb60c2-5eba-4f32-a738-8c79d6c36df7-kube-api-access-4tb8g\") pod \"manila-operator-controller-manager-55f864c847-bz77b\" (UID: \"dfcb60c2-5eba-4f32-a738-8c79d6c36df7\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-bz77b" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.919228 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2784\" (UniqueName: \"kubernetes.io/projected/5c617272-2409-489c-8093-3c943a117a23-kube-api-access-c2784\") pod \"octavia-operator-controller-manager-5b9f45d989-fzxth\" (UID: \"5c617272-2409-489c-8093-3c943a117a23\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fzxth" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.919247 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grp6b\" (UniqueName: \"kubernetes.io/projected/2efade7a-e4d1-43ab-a237-139b25a4163c-kube-api-access-grp6b\") pod \"mariadb-operator-controller-manager-67ccfc9778-br68j\" (UID: \"2efade7a-e4d1-43ab-a237-139b25a4163c\") " 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-br68j" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.919291 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbr9t\" (UniqueName: \"kubernetes.io/projected/e377c7f8-bd46-4193-a288-91f593cc5a25-kube-api-access-qbr9t\") pod \"nova-operator-controller-manager-5d488d59fb-ggbzf\" (UID: \"e377c7f8-bd46-4193-a288-91f593cc5a25\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-ggbzf" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.919325 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-9lwv9\" (UID: \"ba85373b-8f2d-4f13-8ea7-0648b49074da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.919363 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrj6c\" (UniqueName: \"kubernetes.io/projected/a632ee17-dd9e-4ec8-b281-7224395bd2fe-kube-api-access-qrj6c\") pod \"keystone-operator-controller-manager-768b96df4c-hwrx5\" (UID: \"a632ee17-dd9e-4ec8-b281-7224395bd2fe\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.919380 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwhg\" (UniqueName: \"kubernetes.io/projected/67b596ac-2e44-42b4-99b2-fd1c8712aaed-kube-api-access-rdwhg\") pod \"horizon-operator-controller-manager-8464cc45fb-vgbf2\" (UID: \"67b596ac-2e44-42b4-99b2-fd1c8712aaed\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vgbf2" Mar 18 14:18:37 crc kubenswrapper[4756]: E0318 14:18:37.919964 4756 secret.go:188] 
Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 14:18:37 crc kubenswrapper[4756]: E0318 14:18:37.920003 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert podName:ba85373b-8f2d-4f13-8ea7-0648b49074da nodeName:}" failed. No retries permitted until 2026-03-18 14:18:38.419988383 +0000 UTC m=+1119.734406358 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert") pod "infra-operator-controller-manager-7b9c774f96-9lwv9" (UID: "ba85373b-8f2d-4f13-8ea7-0648b49074da") : secret "infra-operator-webhook-server-cert" not found Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.927162 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-fzxth"] Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.945761 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mskn7" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.949816 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwhg\" (UniqueName: \"kubernetes.io/projected/67b596ac-2e44-42b4-99b2-fd1c8712aaed-kube-api-access-rdwhg\") pod \"horizon-operator-controller-manager-8464cc45fb-vgbf2\" (UID: \"67b596ac-2e44-42b4-99b2-fd1c8712aaed\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vgbf2" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.986031 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mhqlh" Mar 18 14:18:37 crc kubenswrapper[4756]: I0318 14:18:37.996426 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-p6hhx" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.001402 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jxnv\" (UniqueName: \"kubernetes.io/projected/6e2eb455-f2ff-40c0-9c26-67c675c1102f-kube-api-access-4jxnv\") pod \"heat-operator-controller-manager-67dd5f86f5-k87mt\" (UID: \"6e2eb455-f2ff-40c0-9c26-67c675c1102f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.011573 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.016233 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctfmz\" (UniqueName: \"kubernetes.io/projected/400b73ce-6d8f-4392-b47b-fef88e8452bd-kube-api-access-ctfmz\") pod \"ironic-operator-controller-manager-6f787dddc9-wx92j\" (UID: \"400b73ce-6d8f-4392-b47b-fef88e8452bd\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wx92j" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.022758 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stk9l\" (UniqueName: \"kubernetes.io/projected/ba85373b-8f2d-4f13-8ea7-0648b49074da-kube-api-access-stk9l\") pod \"infra-operator-controller-manager-7b9c774f96-9lwv9\" (UID: \"ba85373b-8f2d-4f13-8ea7-0648b49074da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.023383 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrj6c\" (UniqueName: \"kubernetes.io/projected/a632ee17-dd9e-4ec8-b281-7224395bd2fe-kube-api-access-qrj6c\") pod \"keystone-operator-controller-manager-768b96df4c-hwrx5\" (UID: 
\"a632ee17-dd9e-4ec8-b281-7224395bd2fe\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.023430 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-682qx\" (UniqueName: \"kubernetes.io/projected/83526c83-3dbe-42b4-a101-7ce37495b4cf-kube-api-access-682qx\") pod \"neutron-operator-controller-manager-767865f676-fk8pc\" (UID: \"83526c83-3dbe-42b4-a101-7ce37495b4cf\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-fk8pc" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.023453 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tb8g\" (UniqueName: \"kubernetes.io/projected/dfcb60c2-5eba-4f32-a738-8c79d6c36df7-kube-api-access-4tb8g\") pod \"manila-operator-controller-manager-55f864c847-bz77b\" (UID: \"dfcb60c2-5eba-4f32-a738-8c79d6c36df7\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-bz77b" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.023471 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2784\" (UniqueName: \"kubernetes.io/projected/5c617272-2409-489c-8093-3c943a117a23-kube-api-access-c2784\") pod \"octavia-operator-controller-manager-5b9f45d989-fzxth\" (UID: \"5c617272-2409-489c-8093-3c943a117a23\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fzxth" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.023485 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grp6b\" (UniqueName: \"kubernetes.io/projected/2efade7a-e4d1-43ab-a237-139b25a4163c-kube-api-access-grp6b\") pod \"mariadb-operator-controller-manager-67ccfc9778-br68j\" (UID: \"2efade7a-e4d1-43ab-a237-139b25a4163c\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-br68j" Mar 18 14:18:38 crc 
kubenswrapper[4756]: I0318 14:18:38.023527 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbr9t\" (UniqueName: \"kubernetes.io/projected/e377c7f8-bd46-4193-a288-91f593cc5a25-kube-api-access-qbr9t\") pod \"nova-operator-controller-manager-5d488d59fb-ggbzf\" (UID: \"e377c7f8-bd46-4193-a288-91f593cc5a25\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-ggbzf" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.036486 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.037294 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.037646 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.039075 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.053319 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-k6gpj"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.054377 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vgbf2" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.054774 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-k6gpj" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.055089 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-tgnfm" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.055234 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.056283 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrj6c\" (UniqueName: \"kubernetes.io/projected/a632ee17-dd9e-4ec8-b281-7224395bd2fe-kube-api-access-qrj6c\") pod \"keystone-operator-controller-manager-768b96df4c-hwrx5\" (UID: \"a632ee17-dd9e-4ec8-b281-7224395bd2fe\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.056898 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-pc626" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.057424 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-drnrr" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.064859 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tb8g\" (UniqueName: \"kubernetes.io/projected/dfcb60c2-5eba-4f32-a738-8c79d6c36df7-kube-api-access-4tb8g\") pod \"manila-operator-controller-manager-55f864c847-bz77b\" (UID: \"dfcb60c2-5eba-4f32-a738-8c79d6c36df7\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-bz77b" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.077090 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-682qx\" (UniqueName: \"kubernetes.io/projected/83526c83-3dbe-42b4-a101-7ce37495b4cf-kube-api-access-682qx\") pod \"neutron-operator-controller-manager-767865f676-fk8pc\" (UID: \"83526c83-3dbe-42b4-a101-7ce37495b4cf\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-fk8pc" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.082174 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.084076 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grp6b\" (UniqueName: \"kubernetes.io/projected/2efade7a-e4d1-43ab-a237-139b25a4163c-kube-api-access-grp6b\") pod \"mariadb-operator-controller-manager-67ccfc9778-br68j\" (UID: \"2efade7a-e4d1-43ab-a237-139b25a4163c\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-br68j" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.084608 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2784\" (UniqueName: \"kubernetes.io/projected/5c617272-2409-489c-8093-3c943a117a23-kube-api-access-c2784\") pod \"octavia-operator-controller-manager-5b9f45d989-fzxth\" (UID: \"5c617272-2409-489c-8093-3c943a117a23\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fzxth" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.086818 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbr9t\" (UniqueName: \"kubernetes.io/projected/e377c7f8-bd46-4193-a288-91f593cc5a25-kube-api-access-qbr9t\") pod \"nova-operator-controller-manager-5d488d59fb-ggbzf\" (UID: \"e377c7f8-bd46-4193-a288-91f593cc5a25\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-ggbzf" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.096556 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-fk8pc" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.124560 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cqpq\" (UniqueName: \"kubernetes.io/projected/c469b9e9-509c-4265-bc75-3d80d75c4365-kube-api-access-5cqpq\") pod \"placement-operator-controller-manager-5784578c99-k6gpj\" (UID: \"c469b9e9-509c-4265-bc75-3d80d75c4365\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-k6gpj" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.124880 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnpgr\" (UniqueName: \"kubernetes.io/projected/f5e9df6f-3f17-4a97-a335-332fb636f9dd-kube-api-access-dnpgr\") pod \"ovn-operator-controller-manager-884679f54-j9sw7\" (UID: \"f5e9df6f-3f17-4a97-a335-332fb636f9dd\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.125033 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjcd5\" (UniqueName: \"kubernetes.io/projected/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-kube-api-access-fjcd5\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m846d\" (UID: \"d48d57d1-c314-4a7c-bd51-26ec5cfebbd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.125215 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m846d\" (UID: \"d48d57d1-c314-4a7c-bd51-26ec5cfebbd1\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.136486 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-ggbzf" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.144057 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.144355 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wx92j" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.187198 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-br68j" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.187996 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fzxth" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.221668 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.226368 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjcd5\" (UniqueName: \"kubernetes.io/projected/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-kube-api-access-fjcd5\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m846d\" (UID: \"d48d57d1-c314-4a7c-bd51-26ec5cfebbd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.227451 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m846d\" (UID: \"d48d57d1-c314-4a7c-bd51-26ec5cfebbd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.227627 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cqpq\" (UniqueName: \"kubernetes.io/projected/c469b9e9-509c-4265-bc75-3d80d75c4365-kube-api-access-5cqpq\") pod \"placement-operator-controller-manager-5784578c99-k6gpj\" (UID: \"c469b9e9-509c-4265-bc75-3d80d75c4365\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-k6gpj" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.227724 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnpgr\" (UniqueName: \"kubernetes.io/projected/f5e9df6f-3f17-4a97-a335-332fb636f9dd-kube-api-access-dnpgr\") pod 
\"ovn-operator-controller-manager-884679f54-j9sw7\" (UID: \"f5e9df6f-3f17-4a97-a335-332fb636f9dd\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7" Mar 18 14:18:38 crc kubenswrapper[4756]: E0318 14:18:38.228469 4756 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 14:18:38 crc kubenswrapper[4756]: E0318 14:18:38.231357 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert podName:d48d57d1-c314-4a7c-bd51-26ec5cfebbd1 nodeName:}" failed. No retries permitted until 2026-03-18 14:18:38.731318683 +0000 UTC m=+1120.045736658 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-m846d" (UID: "d48d57d1-c314-4a7c-bd51-26ec5cfebbd1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.244409 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.244506 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.265340 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-k6gpj"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.265600 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rq85f" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.271839 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cqpq\" (UniqueName: \"kubernetes.io/projected/c469b9e9-509c-4265-bc75-3d80d75c4365-kube-api-access-5cqpq\") pod \"placement-operator-controller-manager-5784578c99-k6gpj\" (UID: \"c469b9e9-509c-4265-bc75-3d80d75c4365\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-k6gpj" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.274041 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjcd5\" (UniqueName: \"kubernetes.io/projected/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-kube-api-access-fjcd5\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m846d\" (UID: \"d48d57d1-c314-4a7c-bd51-26ec5cfebbd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.309360 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bz77b" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.312901 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnpgr\" (UniqueName: \"kubernetes.io/projected/f5e9df6f-3f17-4a97-a335-332fb636f9dd-kube-api-access-dnpgr\") pod \"ovn-operator-controller-manager-884679f54-j9sw7\" (UID: \"f5e9df6f-3f17-4a97-a335-332fb636f9dd\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.339584 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.344930 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.345852 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.351149 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-tf2dg" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.386272 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.396991 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.398175 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.400141 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-px9w6" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.402217 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.426697 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.427674 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.430680 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-8dcz8" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.431727 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh58q\" (UniqueName: \"kubernetes.io/projected/d87d4d63-23ac-4366-88d2-5d8803a7322e-kube-api-access-hh58q\") pod \"swift-operator-controller-manager-c674c5965-4pz7p\" (UID: \"d87d4d63-23ac-4366-88d2-5d8803a7322e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.431808 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-9lwv9\" (UID: \"ba85373b-8f2d-4f13-8ea7-0648b49074da\") " 
pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" Mar 18 14:18:38 crc kubenswrapper[4756]: E0318 14:18:38.431921 4756 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 14:18:38 crc kubenswrapper[4756]: E0318 14:18:38.432146 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert podName:ba85373b-8f2d-4f13-8ea7-0648b49074da nodeName:}" failed. No retries permitted until 2026-03-18 14:18:39.432131809 +0000 UTC m=+1120.746549784 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert") pod "infra-operator-controller-manager-7b9c774f96-9lwv9" (UID: "ba85373b-8f2d-4f13-8ea7-0648b49074da") : secret "infra-operator-webhook-server-cert" not found Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.432880 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.450580 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-k6gpj" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.451044 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.451921 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.456433 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.457482 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.457792 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-r75tc" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.458559 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.479444 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sd7vp"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.480390 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sd7vp" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.481929 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-dgrkk" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.511353 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sd7vp"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.533748 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xl5n\" (UniqueName: \"kubernetes.io/projected/1187469f-b925-4823-8e9e-f3721c8b299b-kube-api-access-9xl5n\") pod \"test-operator-controller-manager-5c5cb9c4d7-wsh26\" (UID: \"1187469f-b925-4823-8e9e-f3721c8b299b\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.533836 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d2fn\" (UniqueName: \"kubernetes.io/projected/726ef03d-4a13-449c-8866-c6f5ee240873-kube-api-access-6d2fn\") pod \"telemetry-operator-controller-manager-5b79d7bc79-kn5zn\" (UID: \"726ef03d-4a13-449c-8866-c6f5ee240873\") " pod="openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.533865 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh58q\" (UniqueName: \"kubernetes.io/projected/d87d4d63-23ac-4366-88d2-5d8803a7322e-kube-api-access-hh58q\") pod \"swift-operator-controller-manager-c674c5965-4pz7p\" (UID: \"d87d4d63-23ac-4366-88d2-5d8803a7322e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.533891 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xlvs\" (UniqueName: \"kubernetes.io/projected/0a54e6d2-b6e2-4808-8c1d-e12b975702cb-kube-api-access-7xlvs\") pod \"watcher-operator-controller-manager-6c4d75f7f9-phw4g\" (UID: \"0a54e6d2-b6e2-4808-8c1d-e12b975702cb\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.581487 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.588150 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh58q\" (UniqueName: \"kubernetes.io/projected/d87d4d63-23ac-4366-88d2-5d8803a7322e-kube-api-access-hh58q\") pod \"swift-operator-controller-manager-c674c5965-4pz7p\" (UID: \"d87d4d63-23ac-4366-88d2-5d8803a7322e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.636131 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.636380 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf5sf\" (UniqueName: \"kubernetes.io/projected/c37f78a1-6298-4f65-9dac-6f597ed75a31-kube-api-access-zf5sf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-sd7vp\" (UID: \"c37f78a1-6298-4f65-9dac-6f597ed75a31\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sd7vp" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.636509 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xl5n\" (UniqueName: \"kubernetes.io/projected/1187469f-b925-4823-8e9e-f3721c8b299b-kube-api-access-9xl5n\") pod \"test-operator-controller-manager-5c5cb9c4d7-wsh26\" (UID: \"1187469f-b925-4823-8e9e-f3721c8b299b\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.636613 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d2fn\" (UniqueName: \"kubernetes.io/projected/726ef03d-4a13-449c-8866-c6f5ee240873-kube-api-access-6d2fn\") pod \"telemetry-operator-controller-manager-5b79d7bc79-kn5zn\" (UID: \"726ef03d-4a13-449c-8866-c6f5ee240873\") " pod="openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.636646 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.636675 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9hvv\" (UniqueName: \"kubernetes.io/projected/d78c665d-9f25-4d44-80eb-12324454e435-kube-api-access-l9hvv\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 
14:18:38.636719 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xlvs\" (UniqueName: \"kubernetes.io/projected/0a54e6d2-b6e2-4808-8c1d-e12b975702cb-kube-api-access-7xlvs\") pod \"watcher-operator-controller-manager-6c4d75f7f9-phw4g\" (UID: \"0a54e6d2-b6e2-4808-8c1d-e12b975702cb\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.656977 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xlvs\" (UniqueName: \"kubernetes.io/projected/0a54e6d2-b6e2-4808-8c1d-e12b975702cb-kube-api-access-7xlvs\") pod \"watcher-operator-controller-manager-6c4d75f7f9-phw4g\" (UID: \"0a54e6d2-b6e2-4808-8c1d-e12b975702cb\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.657522 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xl5n\" (UniqueName: \"kubernetes.io/projected/1187469f-b925-4823-8e9e-f3721c8b299b-kube-api-access-9xl5n\") pod \"test-operator-controller-manager-5c5cb9c4d7-wsh26\" (UID: \"1187469f-b925-4823-8e9e-f3721c8b299b\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.659874 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d2fn\" (UniqueName: \"kubernetes.io/projected/726ef03d-4a13-449c-8866-c6f5ee240873-kube-api-access-6d2fn\") pod \"telemetry-operator-controller-manager-5b79d7bc79-kn5zn\" (UID: \"726ef03d-4a13-449c-8866-c6f5ee240873\") " pod="openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.737848 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.737894 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf5sf\" (UniqueName: \"kubernetes.io/projected/c37f78a1-6298-4f65-9dac-6f597ed75a31-kube-api-access-zf5sf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-sd7vp\" (UID: \"c37f78a1-6298-4f65-9dac-6f597ed75a31\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sd7vp" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.737930 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m846d\" (UID: \"d48d57d1-c314-4a7c-bd51-26ec5cfebbd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.737982 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.738002 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9hvv\" (UniqueName: \"kubernetes.io/projected/d78c665d-9f25-4d44-80eb-12324454e435-kube-api-access-l9hvv\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " 
pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:38 crc kubenswrapper[4756]: E0318 14:18:38.738444 4756 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 14:18:38 crc kubenswrapper[4756]: E0318 14:18:38.738488 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs podName:d78c665d-9f25-4d44-80eb-12324454e435 nodeName:}" failed. No retries permitted until 2026-03-18 14:18:39.238475214 +0000 UTC m=+1120.552893189 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs") pod "openstack-operator-controller-manager-f84d7fd4f-tksnb" (UID: "d78c665d-9f25-4d44-80eb-12324454e435") : secret "webhook-server-cert" not found Mar 18 14:18:38 crc kubenswrapper[4756]: E0318 14:18:38.738718 4756 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 14:18:38 crc kubenswrapper[4756]: E0318 14:18:38.738747 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert podName:d48d57d1-c314-4a7c-bd51-26ec5cfebbd1 nodeName:}" failed. No retries permitted until 2026-03-18 14:18:39.738739401 +0000 UTC m=+1121.053157376 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-m846d" (UID: "d48d57d1-c314-4a7c-bd51-26ec5cfebbd1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 14:18:38 crc kubenswrapper[4756]: E0318 14:18:38.738780 4756 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 14:18:38 crc kubenswrapper[4756]: E0318 14:18:38.738800 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs podName:d78c665d-9f25-4d44-80eb-12324454e435 nodeName:}" failed. No retries permitted until 2026-03-18 14:18:39.238794913 +0000 UTC m=+1120.553212878 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs") pod "openstack-operator-controller-manager-f84d7fd4f-tksnb" (UID: "d78c665d-9f25-4d44-80eb-12324454e435") : secret "metrics-server-cert" not found Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.765236 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.765841 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9hvv\" (UniqueName: \"kubernetes.io/projected/d78c665d-9f25-4d44-80eb-12324454e435-kube-api-access-l9hvv\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.778796 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf5sf\" (UniqueName: \"kubernetes.io/projected/c37f78a1-6298-4f65-9dac-6f597ed75a31-kube-api-access-zf5sf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-sd7vp\" (UID: \"c37f78a1-6298-4f65-9dac-6f597ed75a31\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sd7vp" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.801620 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.816460 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.850041 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g" Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.876084 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-p6hhx"] Mar 18 14:18:38 crc kubenswrapper[4756]: I0318 14:18:38.936196 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sd7vp" Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.244844 4756 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.245603 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.245754 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.245854 4756 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.245905 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs podName:d78c665d-9f25-4d44-80eb-12324454e435 nodeName:}" failed. No retries permitted until 2026-03-18 14:18:40.245880351 +0000 UTC m=+1121.560298326 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs") pod "openstack-operator-controller-manager-f84d7fd4f-tksnb" (UID: "d78c665d-9f25-4d44-80eb-12324454e435") : secret "webhook-server-cert" not found Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.245927 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs podName:d78c665d-9f25-4d44-80eb-12324454e435 nodeName:}" failed. No retries permitted until 2026-03-18 14:18:40.245919252 +0000 UTC m=+1121.560337227 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs") pod "openstack-operator-controller-manager-f84d7fd4f-tksnb" (UID: "d78c665d-9f25-4d44-80eb-12324454e435") : secret "metrics-server-cert" not found Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.267080 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-gmmtd"] Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.288276 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-ggbzf"] Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.317454 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-mhqlh"] Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.376385 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-mskn7"] Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.376417 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-vgbf2"] Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 
14:18:39.467458 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-9lwv9\" (UID: \"ba85373b-8f2d-4f13-8ea7-0648b49074da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.467650 4756 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.467734 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert podName:ba85373b-8f2d-4f13-8ea7-0648b49074da nodeName:}" failed. No retries permitted until 2026-03-18 14:18:41.467703814 +0000 UTC m=+1122.782121789 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert") pod "infra-operator-controller-manager-7b9c774f96-9lwv9" (UID: "ba85373b-8f2d-4f13-8ea7-0648b49074da") : secret "infra-operator-webhook-server-cert" not found Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.577029 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-wx92j"] Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.583863 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-fzxth"] Mar 18 14:18:39 crc kubenswrapper[4756]: W0318 14:18:39.590357 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83526c83_3dbe_42b4_a101_7ce37495b4cf.slice/crio-ca2fa8eb1c546cc980a9541d6fbbf212930103f40a3145a0126a1bedc42136e5 WatchSource:0}: Error finding container 
ca2fa8eb1c546cc980a9541d6fbbf212930103f40a3145a0126a1bedc42136e5: Status 404 returned error can't find the container with id ca2fa8eb1c546cc980a9541d6fbbf212930103f40a3145a0126a1bedc42136e5 Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.598613 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-fk8pc"] Mar 18 14:18:39 crc kubenswrapper[4756]: W0318 14:18:39.603847 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc469b9e9_509c_4265_bc75_3d80d75c4365.slice/crio-d7707d2f7a26d098900798b2a8880b28fbab9d80e1762725f3bd0c0a042ce08a WatchSource:0}: Error finding container d7707d2f7a26d098900798b2a8880b28fbab9d80e1762725f3bd0c0a042ce08a: Status 404 returned error can't find the container with id d7707d2f7a26d098900798b2a8880b28fbab9d80e1762725f3bd0c0a042ce08a Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.604059 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-k6gpj"] Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.608710 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-br68j"] Mar 18 14:18:39 crc kubenswrapper[4756]: W0318 14:18:39.613342 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c617272_2409_489c_8093_3c943a117a23.slice/crio-c8ecaab3c01425eba920a1d13c427806a87bbc454813a1ae315e31818e3d0c49 WatchSource:0}: Error finding container c8ecaab3c01425eba920a1d13c427806a87bbc454813a1ae315e31818e3d0c49: Status 404 returned error can't find the container with id c8ecaab3c01425eba920a1d13c427806a87bbc454813a1ae315e31818e3d0c49 Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.615075 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26"] Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.619434 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-bz77b"] Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.629370 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9xl5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-wsh26_openstack-operators(1187469f-b925-4823-8e9e-f3721c8b299b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.629831 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7"] Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.632303 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26" podUID="1187469f-b925-4823-8e9e-f3721c8b299b" Mar 18 14:18:39 crc kubenswrapper[4756]: W0318 14:18:39.633823 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e2eb455_f2ff_40c0_9c26_67c675c1102f.slice/crio-25091fd7084cc237b33a68183315a54a074d47c4ee28e5b733bd96e96890e6a8 WatchSource:0}: Error finding container 25091fd7084cc237b33a68183315a54a074d47c4ee28e5b733bd96e96890e6a8: Status 404 returned error can't find the container with id 
25091fd7084cc237b33a68183315a54a074d47c4ee28e5b733bd96e96890e6a8 Mar 18 14:18:39 crc kubenswrapper[4756]: W0318 14:18:39.634933 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5e9df6f_3f17_4a97_a335_332fb636f9dd.slice/crio-f7375478422ab28881f64e0396958d74f38f27cfe775bc418f0eefab493bd6f2 WatchSource:0}: Error finding container f7375478422ab28881f64e0396958d74f38f27cfe775bc418f0eefab493bd6f2: Status 404 returned error can't find the container with id f7375478422ab28881f64e0396958d74f38f27cfe775bc418f0eefab493bd6f2 Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.635734 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt"] Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.639096 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4jxnv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-67dd5f86f5-k87mt_openstack-operators(6e2eb455-f2ff-40c0-9c26-67c675c1102f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.639264 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dnpgr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-j9sw7_openstack-operators(f5e9df6f-3f17-4a97-a335-332fb636f9dd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.640543 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g"] Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.640619 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7" podUID="f5e9df6f-3f17-4a97-a335-332fb636f9dd" Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.640671 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt" podUID="6e2eb455-f2ff-40c0-9c26-67c675c1102f" Mar 18 14:18:39 crc kubenswrapper[4756]: W0318 14:18:39.646708 4756 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a54e6d2_b6e2_4808_8c1d_e12b975702cb.slice/crio-b5b2c39286749a16ff5d1a00e554d2be38adff9523b6ea08a951b2b84e43c7d0 WatchSource:0}: Error finding container b5b2c39286749a16ff5d1a00e554d2be38adff9523b6ea08a951b2b84e43c7d0: Status 404 returned error can't find the container with id b5b2c39286749a16ff5d1a00e554d2be38adff9523b6ea08a951b2b84e43c7d0 Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.647811 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5"] Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.648411 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7xlvs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-phw4g_openstack-operators(0a54e6d2-b6e2-4808-8c1d-e12b975702cb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.649994 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g" podUID="0a54e6d2-b6e2-4808-8c1d-e12b975702cb" Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.656794 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wx92j" event={"ID":"400b73ce-6d8f-4392-b47b-fef88e8452bd","Type":"ContainerStarted","Data":"282f3bb0f56760a2319899b9167363ca80d707cc5de0d643eb06871d8edcb1fd"} Mar 18 14:18:39 crc 
kubenswrapper[4756]: E0318 14:18:39.657133 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qrj6c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-hwrx5_openstack-operators(a632ee17-dd9e-4ec8-b281-7224395bd2fe): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.658230 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5" podUID="a632ee17-dd9e-4ec8-b281-7224395bd2fe" Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.659300 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hh58q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-4pz7p_openstack-operators(d87d4d63-23ac-4366-88d2-5d8803a7322e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.659545 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.194:5001/openstack-k8s-operators/telemetry-operator:15c2ffcfe08e13a1dec28232b4ee653042564ac3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6d2fn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5b79d7bc79-kn5zn_openstack-operators(726ef03d-4a13-449c-8866-c6f5ee240873): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.660251 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-br68j" event={"ID":"2efade7a-e4d1-43ab-a237-139b25a4163c","Type":"ContainerStarted","Data":"4e3ad7314da016ab8a0c357c787c48935c02c66e26c4955a808667ba837ba4ac"} Mar 18 
14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.660486 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p" podUID="d87d4d63-23ac-4366-88d2-5d8803a7322e" Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.660720 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn" podUID="726ef03d-4a13-449c-8866-c6f5ee240873" Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.660759 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn"] Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.663617 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fzxth" event={"ID":"5c617272-2409-489c-8093-3c943a117a23","Type":"ContainerStarted","Data":"c8ecaab3c01425eba920a1d13c427806a87bbc454813a1ae315e31818e3d0c49"} Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.665904 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p"] Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.667194 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mhqlh" event={"ID":"d05621f7-0f1e-4b58-b016-6cdb083fed42","Type":"ContainerStarted","Data":"0ee723635402d9cff16bf9b7121ae4898e8af0302f5e044dd1234a304811f477"} Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.667238 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zf5sf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-sd7vp_openstack-operators(c37f78a1-6298-4f65-9dac-6f597ed75a31): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.668327 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bz77b" event={"ID":"dfcb60c2-5eba-4f32-a738-8c79d6c36df7","Type":"ContainerStarted","Data":"34525d92f32b637f746ff1f4482c8974d1c92d5396b6c32232b75b9da3d75921"} Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.668328 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sd7vp" podUID="c37f78a1-6298-4f65-9dac-6f597ed75a31" Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.670063 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-gmmtd" event={"ID":"a5afd6e2-d647-4165-a9ef-506d7e16173c","Type":"ContainerStarted","Data":"dccf8ab1a6c83190e42d9e460f45fd967a573742a43fd0bc9c395a870a610a1e"} Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.671470 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-k6gpj" event={"ID":"c469b9e9-509c-4265-bc75-3d80d75c4365","Type":"ContainerStarted","Data":"d7707d2f7a26d098900798b2a8880b28fbab9d80e1762725f3bd0c0a042ce08a"} Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.671655 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sd7vp"] Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.672343 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-p6hhx" event={"ID":"aab73c06-7468-4302-ab88-6c91308ca2ac","Type":"ContainerStarted","Data":"c21320960c1b7420ab7bbfa685bb7321b652a93f6572b480e8c7dea77d083012"} Mar 18 
14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.673329 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mskn7" event={"ID":"1a62c450-4f9c-4c7b-a864-2600eb6c8589","Type":"ContainerStarted","Data":"772b916cd875edfedc220815a06de8b70794098eb27b01a78a2a6d3a2f145fba"} Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.674263 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7" event={"ID":"f5e9df6f-3f17-4a97-a335-332fb636f9dd","Type":"ContainerStarted","Data":"f7375478422ab28881f64e0396958d74f38f27cfe775bc418f0eefab493bd6f2"} Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.675821 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7" podUID="f5e9df6f-3f17-4a97-a335-332fb636f9dd" Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.675847 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-ggbzf" event={"ID":"e377c7f8-bd46-4193-a288-91f593cc5a25","Type":"ContainerStarted","Data":"513f9cd1e5d99c2b63ec560a9e864f3f7f204eb99a50a4ed4e3c42332fc58152"} Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.676647 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-fk8pc" event={"ID":"83526c83-3dbe-42b4-a101-7ce37495b4cf","Type":"ContainerStarted","Data":"ca2fa8eb1c546cc980a9541d6fbbf212930103f40a3145a0126a1bedc42136e5"} Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.677411 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26" event={"ID":"1187469f-b925-4823-8e9e-f3721c8b299b","Type":"ContainerStarted","Data":"82a0ee69b0b4e998a32fe56c71acc8c9546a875ef90b5fcb9cb1afeb7f1c7957"} Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.678680 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26" podUID="1187469f-b925-4823-8e9e-f3721c8b299b" Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.679292 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vgbf2" event={"ID":"67b596ac-2e44-42b4-99b2-fd1c8712aaed","Type":"ContainerStarted","Data":"fafaeb73b0211e14fbd30599afd85568268fe0569dcd6cde32b99c8c4947165b"} Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.680333 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt" event={"ID":"6e2eb455-f2ff-40c0-9c26-67c675c1102f","Type":"ContainerStarted","Data":"25091fd7084cc237b33a68183315a54a074d47c4ee28e5b733bd96e96890e6a8"} Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.681271 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt" podUID="6e2eb455-f2ff-40c0-9c26-67c675c1102f" Mar 18 14:18:39 crc kubenswrapper[4756]: I0318 14:18:39.770502 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m846d\" (UID: \"d48d57d1-c314-4a7c-bd51-26ec5cfebbd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.770738 4756 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 14:18:39 crc kubenswrapper[4756]: E0318 14:18:39.770825 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert podName:d48d57d1-c314-4a7c-bd51-26ec5cfebbd1 nodeName:}" failed. No retries permitted until 2026-03-18 14:18:41.770803982 +0000 UTC m=+1123.085221967 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-m846d" (UID: "d48d57d1-c314-4a7c-bd51-26ec5cfebbd1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 14:18:40 crc kubenswrapper[4756]: I0318 14:18:40.282858 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:40 crc kubenswrapper[4756]: I0318 14:18:40.282948 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: 
\"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:40 crc kubenswrapper[4756]: E0318 14:18:40.283041 4756 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 14:18:40 crc kubenswrapper[4756]: E0318 14:18:40.283057 4756 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 14:18:40 crc kubenswrapper[4756]: E0318 14:18:40.283110 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs podName:d78c665d-9f25-4d44-80eb-12324454e435 nodeName:}" failed. No retries permitted until 2026-03-18 14:18:42.283092681 +0000 UTC m=+1123.597510656 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs") pod "openstack-operator-controller-manager-f84d7fd4f-tksnb" (UID: "d78c665d-9f25-4d44-80eb-12324454e435") : secret "metrics-server-cert" not found Mar 18 14:18:40 crc kubenswrapper[4756]: E0318 14:18:40.283160 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs podName:d78c665d-9f25-4d44-80eb-12324454e435 nodeName:}" failed. No retries permitted until 2026-03-18 14:18:42.283153423 +0000 UTC m=+1123.597571398 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs") pod "openstack-operator-controller-manager-f84d7fd4f-tksnb" (UID: "d78c665d-9f25-4d44-80eb-12324454e435") : secret "webhook-server-cert" not found Mar 18 14:18:40 crc kubenswrapper[4756]: I0318 14:18:40.692331 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p" event={"ID":"d87d4d63-23ac-4366-88d2-5d8803a7322e","Type":"ContainerStarted","Data":"bbacc75ed4a5720023b114192ae80cd19bc49802419157763d33ec3093468348"} Mar 18 14:18:40 crc kubenswrapper[4756]: I0318 14:18:40.693471 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn" event={"ID":"726ef03d-4a13-449c-8866-c6f5ee240873","Type":"ContainerStarted","Data":"3b08d8582c8cd4b05db8c2299c2472ef66f6052a4f4467aadfaebba820a95309"} Mar 18 14:18:40 crc kubenswrapper[4756]: E0318 14:18:40.694197 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p" podUID="d87d4d63-23ac-4366-88d2-5d8803a7322e" Mar 18 14:18:40 crc kubenswrapper[4756]: E0318 14:18:40.699435 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.194:5001/openstack-k8s-operators/telemetry-operator:15c2ffcfe08e13a1dec28232b4ee653042564ac3\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn" podUID="726ef03d-4a13-449c-8866-c6f5ee240873" Mar 18 14:18:40 crc kubenswrapper[4756]: I0318 14:18:40.702795 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sd7vp" event={"ID":"c37f78a1-6298-4f65-9dac-6f597ed75a31","Type":"ContainerStarted","Data":"0262984a3c94fc2edbb6b329293b9e7f5d6ff4afc3415b2b3ec97d29ed755882"} Mar 18 14:18:40 crc kubenswrapper[4756]: E0318 14:18:40.708891 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sd7vp" podUID="c37f78a1-6298-4f65-9dac-6f597ed75a31" Mar 18 14:18:40 crc kubenswrapper[4756]: I0318 14:18:40.714162 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g" event={"ID":"0a54e6d2-b6e2-4808-8c1d-e12b975702cb","Type":"ContainerStarted","Data":"b5b2c39286749a16ff5d1a00e554d2be38adff9523b6ea08a951b2b84e43c7d0"} Mar 18 14:18:40 crc kubenswrapper[4756]: E0318 14:18:40.716216 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g" podUID="0a54e6d2-b6e2-4808-8c1d-e12b975702cb" Mar 18 14:18:40 crc kubenswrapper[4756]: I0318 14:18:40.718495 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5" event={"ID":"a632ee17-dd9e-4ec8-b281-7224395bd2fe","Type":"ContainerStarted","Data":"06bd9097df0c3471e7fe2920534e67e8b1b81100168b6d72de77e951f10e0c2b"} Mar 18 14:18:40 crc kubenswrapper[4756]: E0318 14:18:40.720567 
4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26" podUID="1187469f-b925-4823-8e9e-f3721c8b299b" Mar 18 14:18:40 crc kubenswrapper[4756]: E0318 14:18:40.720912 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7" podUID="f5e9df6f-3f17-4a97-a335-332fb636f9dd" Mar 18 14:18:40 crc kubenswrapper[4756]: E0318 14:18:40.721185 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5" podUID="a632ee17-dd9e-4ec8-b281-7224395bd2fe" Mar 18 14:18:40 crc kubenswrapper[4756]: E0318 14:18:40.721776 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt" podUID="6e2eb455-f2ff-40c0-9c26-67c675c1102f" Mar 18 14:18:41 crc kubenswrapper[4756]: I0318 14:18:41.512172 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-9lwv9\" (UID: \"ba85373b-8f2d-4f13-8ea7-0648b49074da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" Mar 18 14:18:41 crc kubenswrapper[4756]: E0318 14:18:41.512350 4756 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 14:18:41 crc kubenswrapper[4756]: E0318 14:18:41.512421 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert podName:ba85373b-8f2d-4f13-8ea7-0648b49074da nodeName:}" failed. No retries permitted until 2026-03-18 14:18:45.512404441 +0000 UTC m=+1126.826822416 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert") pod "infra-operator-controller-manager-7b9c774f96-9lwv9" (UID: "ba85373b-8f2d-4f13-8ea7-0648b49074da") : secret "infra-operator-webhook-server-cert" not found Mar 18 14:18:41 crc kubenswrapper[4756]: E0318 14:18:41.728691 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p" podUID="d87d4d63-23ac-4366-88d2-5d8803a7322e" Mar 18 14:18:41 crc kubenswrapper[4756]: E0318 14:18:41.729498 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.194:5001/openstack-k8s-operators/telemetry-operator:15c2ffcfe08e13a1dec28232b4ee653042564ac3\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn" podUID="726ef03d-4a13-449c-8866-c6f5ee240873" Mar 18 14:18:41 crc kubenswrapper[4756]: E0318 14:18:41.729594 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g" podUID="0a54e6d2-b6e2-4808-8c1d-e12b975702cb" Mar 18 14:18:41 crc kubenswrapper[4756]: E0318 14:18:41.729802 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5" podUID="a632ee17-dd9e-4ec8-b281-7224395bd2fe" Mar 18 14:18:41 crc kubenswrapper[4756]: E0318 14:18:41.735972 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sd7vp" podUID="c37f78a1-6298-4f65-9dac-6f597ed75a31" Mar 18 14:18:41 crc kubenswrapper[4756]: I0318 14:18:41.817244 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m846d\" (UID: \"d48d57d1-c314-4a7c-bd51-26ec5cfebbd1\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" Mar 18 14:18:41 crc kubenswrapper[4756]: E0318 14:18:41.817770 4756 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 14:18:41 crc kubenswrapper[4756]: E0318 14:18:41.817819 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert podName:d48d57d1-c314-4a7c-bd51-26ec5cfebbd1 nodeName:}" failed. No retries permitted until 2026-03-18 14:18:45.817803001 +0000 UTC m=+1127.132220976 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-m846d" (UID: "d48d57d1-c314-4a7c-bd51-26ec5cfebbd1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 14:18:42 crc kubenswrapper[4756]: I0318 14:18:42.324187 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:42 crc kubenswrapper[4756]: I0318 14:18:42.324283 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:42 crc kubenswrapper[4756]: E0318 14:18:42.324446 4756 secret.go:188] Couldn't get 
secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 14:18:42 crc kubenswrapper[4756]: E0318 14:18:42.324492 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs podName:d78c665d-9f25-4d44-80eb-12324454e435 nodeName:}" failed. No retries permitted until 2026-03-18 14:18:46.324477959 +0000 UTC m=+1127.638895934 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs") pod "openstack-operator-controller-manager-f84d7fd4f-tksnb" (UID: "d78c665d-9f25-4d44-80eb-12324454e435") : secret "webhook-server-cert" not found Mar 18 14:18:42 crc kubenswrapper[4756]: E0318 14:18:42.324534 4756 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 14:18:42 crc kubenswrapper[4756]: E0318 14:18:42.324552 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs podName:d78c665d-9f25-4d44-80eb-12324454e435 nodeName:}" failed. No retries permitted until 2026-03-18 14:18:46.324545231 +0000 UTC m=+1127.638963206 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs") pod "openstack-operator-controller-manager-f84d7fd4f-tksnb" (UID: "d78c665d-9f25-4d44-80eb-12324454e435") : secret "metrics-server-cert" not found Mar 18 14:18:45 crc kubenswrapper[4756]: I0318 14:18:45.568544 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-9lwv9\" (UID: \"ba85373b-8f2d-4f13-8ea7-0648b49074da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" Mar 18 14:18:45 crc kubenswrapper[4756]: E0318 14:18:45.569140 4756 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 14:18:45 crc kubenswrapper[4756]: E0318 14:18:45.569374 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert podName:ba85373b-8f2d-4f13-8ea7-0648b49074da nodeName:}" failed. No retries permitted until 2026-03-18 14:18:53.569353779 +0000 UTC m=+1134.883771774 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert") pod "infra-operator-controller-manager-7b9c774f96-9lwv9" (UID: "ba85373b-8f2d-4f13-8ea7-0648b49074da") : secret "infra-operator-webhook-server-cert" not found Mar 18 14:18:45 crc kubenswrapper[4756]: I0318 14:18:45.874217 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m846d\" (UID: \"d48d57d1-c314-4a7c-bd51-26ec5cfebbd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" Mar 18 14:18:45 crc kubenswrapper[4756]: E0318 14:18:45.874384 4756 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 14:18:45 crc kubenswrapper[4756]: E0318 14:18:45.874452 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert podName:d48d57d1-c314-4a7c-bd51-26ec5cfebbd1 nodeName:}" failed. No retries permitted until 2026-03-18 14:18:53.87443257 +0000 UTC m=+1135.188850545 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-m846d" (UID: "d48d57d1-c314-4a7c-bd51-26ec5cfebbd1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 14:18:46 crc kubenswrapper[4756]: I0318 14:18:46.380778 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:46 crc kubenswrapper[4756]: I0318 14:18:46.380886 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:46 crc kubenswrapper[4756]: E0318 14:18:46.381022 4756 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 14:18:46 crc kubenswrapper[4756]: E0318 14:18:46.381072 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs podName:d78c665d-9f25-4d44-80eb-12324454e435 nodeName:}" failed. No retries permitted until 2026-03-18 14:18:54.381057277 +0000 UTC m=+1135.695475262 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs") pod "openstack-operator-controller-manager-f84d7fd4f-tksnb" (UID: "d78c665d-9f25-4d44-80eb-12324454e435") : secret "webhook-server-cert" not found Mar 18 14:18:46 crc kubenswrapper[4756]: E0318 14:18:46.381393 4756 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 14:18:46 crc kubenswrapper[4756]: E0318 14:18:46.381484 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs podName:d78c665d-9f25-4d44-80eb-12324454e435 nodeName:}" failed. No retries permitted until 2026-03-18 14:18:54.381462488 +0000 UTC m=+1135.695880473 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs") pod "openstack-operator-controller-manager-f84d7fd4f-tksnb" (UID: "d78c665d-9f25-4d44-80eb-12324454e435") : secret "metrics-server-cert" not found Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.785514 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-br68j" event={"ID":"2efade7a-e4d1-43ab-a237-139b25a4163c","Type":"ContainerStarted","Data":"dbde28999d7ab87273f921f917f972befbf707a9a00f9659010d371be3a81c70"} Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.785806 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-br68j" Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.788006 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-k6gpj" 
event={"ID":"c469b9e9-509c-4265-bc75-3d80d75c4365","Type":"ContainerStarted","Data":"66d864464e0e045cc34d25c211ea524de2d1c945960e90a3ee4a4aabd1a4c44c"} Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.788134 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-k6gpj" Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.790349 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-p6hhx" event={"ID":"aab73c06-7468-4302-ab88-6c91308ca2ac","Type":"ContainerStarted","Data":"f3cdfd974ef18b5791d0b5f5207ce03e2b4ba1346910dc63d072f5446e74a6d8"} Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.790441 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-p6hhx" Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.792614 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wx92j" event={"ID":"400b73ce-6d8f-4392-b47b-fef88e8452bd","Type":"ContainerStarted","Data":"850bb32d766aa3a887d37349b121b25d6e1df1c82cca5b09c8587f09fdb96375"} Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.792990 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wx92j" Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.798811 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-ggbzf" event={"ID":"e377c7f8-bd46-4193-a288-91f593cc5a25","Type":"ContainerStarted","Data":"6b8b0f3feb6dd0209a7561361c4b3d1ae60f0621e3140e6bc9d991cc09664a87"} Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.799290 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-ggbzf" Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.800891 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mskn7" event={"ID":"1a62c450-4f9c-4c7b-a864-2600eb6c8589","Type":"ContainerStarted","Data":"e2f5dcd866926ccef212175a92a896ef79a98d9a90e701e53e66604296c9c3cc"} Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.801210 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mskn7" Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.802631 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vgbf2" event={"ID":"67b596ac-2e44-42b4-99b2-fd1c8712aaed","Type":"ContainerStarted","Data":"4dd4c4322d91388167aad63922637376989a0ca187d8787587d6cb748fd1b056"} Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.806381 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vgbf2" Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.857749 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fzxth" event={"ID":"5c617272-2409-489c-8093-3c943a117a23","Type":"ContainerStarted","Data":"967d406f778b663314eab452170a8d6bf47be5f06daa770e1cebea95eeba21d3"} Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.857793 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fzxth" Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.866872 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bz77b" 
event={"ID":"dfcb60c2-5eba-4f32-a738-8c79d6c36df7","Type":"ContainerStarted","Data":"6b661a5b97d6c28eaf39d872e350588224bea2c16f1891542c7421a854c65f9d"} Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.867398 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bz77b" Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.889508 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-gmmtd" event={"ID":"a5afd6e2-d647-4165-a9ef-506d7e16173c","Type":"ContainerStarted","Data":"5749b278e109911f1bf09e8b1f0033f2b0e01d81b23875a08397e2596a6f536d"} Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.890314 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-gmmtd" Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.897143 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mhqlh" event={"ID":"d05621f7-0f1e-4b58-b016-6cdb083fed42","Type":"ContainerStarted","Data":"ba00d5b7a2056626a68f20b13fedd7be7bdef654cefe6d2344165249b82aa6cc"} Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.897195 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mhqlh" Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.900742 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-fk8pc" event={"ID":"83526c83-3dbe-42b4-a101-7ce37495b4cf","Type":"ContainerStarted","Data":"7ec086b3e6f46a1e2e93209540f1bfbd570379b473b8a5d3d3ebc899b4bd067b"} Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.901519 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/neutron-operator-controller-manager-767865f676-fk8pc" Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.907606 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-br68j" podStartSLOduration=3.606344263 podStartE2EDuration="12.907583306s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.62232194 +0000 UTC m=+1120.936739915" lastFinishedPulling="2026-03-18 14:18:48.923560943 +0000 UTC m=+1130.237978958" observedRunningTime="2026-03-18 14:18:49.858640723 +0000 UTC m=+1131.173058698" watchObservedRunningTime="2026-03-18 14:18:49.907583306 +0000 UTC m=+1131.222001301" Mar 18 14:18:49 crc kubenswrapper[4756]: I0318 14:18:49.918820 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vgbf2" podStartSLOduration=3.340418439 podStartE2EDuration="12.918795869s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.384521336 +0000 UTC m=+1120.698939311" lastFinishedPulling="2026-03-18 14:18:48.962898766 +0000 UTC m=+1130.277316741" observedRunningTime="2026-03-18 14:18:49.911652665 +0000 UTC m=+1131.226070640" watchObservedRunningTime="2026-03-18 14:18:49.918795869 +0000 UTC m=+1131.233213844" Mar 18 14:18:50 crc kubenswrapper[4756]: I0318 14:18:50.095912 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fzxth" podStartSLOduration=3.75016428 podStartE2EDuration="13.095894573s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.616033591 +0000 UTC m=+1120.930451566" lastFinishedPulling="2026-03-18 14:18:48.961763884 +0000 UTC m=+1130.276181859" observedRunningTime="2026-03-18 14:18:50.08431409 +0000 UTC m=+1131.398732065" 
watchObservedRunningTime="2026-03-18 14:18:50.095894573 +0000 UTC m=+1131.410312548" Mar 18 14:18:50 crc kubenswrapper[4756]: I0318 14:18:50.096271 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-k6gpj" podStartSLOduration=3.767854647 podStartE2EDuration="13.096267243s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.615546848 +0000 UTC m=+1120.929964823" lastFinishedPulling="2026-03-18 14:18:48.943959394 +0000 UTC m=+1130.258377419" observedRunningTime="2026-03-18 14:18:49.983511277 +0000 UTC m=+1131.297929252" watchObservedRunningTime="2026-03-18 14:18:50.096267243 +0000 UTC m=+1131.410685218" Mar 18 14:18:50 crc kubenswrapper[4756]: I0318 14:18:50.180675 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wx92j" podStartSLOduration=3.838964029 podStartE2EDuration="13.180660002s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.586314299 +0000 UTC m=+1120.900732274" lastFinishedPulling="2026-03-18 14:18:48.928010232 +0000 UTC m=+1130.242428247" observedRunningTime="2026-03-18 14:18:50.180044066 +0000 UTC m=+1131.494462051" watchObservedRunningTime="2026-03-18 14:18:50.180660002 +0000 UTC m=+1131.495077977" Mar 18 14:18:50 crc kubenswrapper[4756]: I0318 14:18:50.229670 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-p6hhx" podStartSLOduration=3.191856627 podStartE2EDuration="13.229651956s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:38.909554456 +0000 UTC m=+1120.223972431" lastFinishedPulling="2026-03-18 14:18:48.947349785 +0000 UTC m=+1130.261767760" observedRunningTime="2026-03-18 14:18:50.222309318 +0000 UTC m=+1131.536727303" 
watchObservedRunningTime="2026-03-18 14:18:50.229651956 +0000 UTC m=+1131.544069931" Mar 18 14:18:50 crc kubenswrapper[4756]: I0318 14:18:50.272937 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-ggbzf" podStartSLOduration=3.697960069 podStartE2EDuration="13.272917025s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.384218178 +0000 UTC m=+1120.698636153" lastFinishedPulling="2026-03-18 14:18:48.959175144 +0000 UTC m=+1130.273593109" observedRunningTime="2026-03-18 14:18:50.254816306 +0000 UTC m=+1131.569234291" watchObservedRunningTime="2026-03-18 14:18:50.272917025 +0000 UTC m=+1131.587335000" Mar 18 14:18:50 crc kubenswrapper[4756]: I0318 14:18:50.301994 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mskn7" podStartSLOduration=3.740949231 podStartE2EDuration="13.30197876s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.39091109 +0000 UTC m=+1120.705329055" lastFinishedPulling="2026-03-18 14:18:48.951940609 +0000 UTC m=+1130.266358584" observedRunningTime="2026-03-18 14:18:50.299208896 +0000 UTC m=+1131.613626861" watchObservedRunningTime="2026-03-18 14:18:50.30197876 +0000 UTC m=+1131.616396735" Mar 18 14:18:50 crc kubenswrapper[4756]: I0318 14:18:50.363242 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bz77b" podStartSLOduration=4.059915627 podStartE2EDuration="13.363226395s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.620254015 +0000 UTC m=+1120.934671990" lastFinishedPulling="2026-03-18 14:18:48.923564783 +0000 UTC m=+1130.237982758" observedRunningTime="2026-03-18 14:18:50.358017234 +0000 UTC m=+1131.672435209" 
watchObservedRunningTime="2026-03-18 14:18:50.363226395 +0000 UTC m=+1131.677644370" Mar 18 14:18:50 crc kubenswrapper[4756]: I0318 14:18:50.401480 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-gmmtd" podStartSLOduration=3.767147407 podStartE2EDuration="13.401463287s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.289981983 +0000 UTC m=+1120.604399958" lastFinishedPulling="2026-03-18 14:18:48.924297863 +0000 UTC m=+1130.238715838" observedRunningTime="2026-03-18 14:18:50.400533763 +0000 UTC m=+1131.714951738" watchObservedRunningTime="2026-03-18 14:18:50.401463287 +0000 UTC m=+1131.715881262" Mar 18 14:18:50 crc kubenswrapper[4756]: I0318 14:18:50.426616 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mhqlh" podStartSLOduration=3.887975202 podStartE2EDuration="13.426596987s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.384863286 +0000 UTC m=+1120.699281261" lastFinishedPulling="2026-03-18 14:18:48.923485081 +0000 UTC m=+1130.237903046" observedRunningTime="2026-03-18 14:18:50.426308929 +0000 UTC m=+1131.740726894" watchObservedRunningTime="2026-03-18 14:18:50.426596987 +0000 UTC m=+1131.741014962" Mar 18 14:18:50 crc kubenswrapper[4756]: I0318 14:18:50.454382 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-fk8pc" podStartSLOduration=4.096959657 podStartE2EDuration="13.454366307s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.593942344 +0000 UTC m=+1120.908360319" lastFinishedPulling="2026-03-18 14:18:48.951348984 +0000 UTC m=+1130.265766969" observedRunningTime="2026-03-18 14:18:50.449501875 +0000 UTC m=+1131.763919850" 
watchObservedRunningTime="2026-03-18 14:18:50.454366307 +0000 UTC m=+1131.768784282" Mar 18 14:18:53 crc kubenswrapper[4756]: I0318 14:18:53.608456 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-9lwv9\" (UID: \"ba85373b-8f2d-4f13-8ea7-0648b49074da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" Mar 18 14:18:53 crc kubenswrapper[4756]: E0318 14:18:53.608618 4756 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 14:18:53 crc kubenswrapper[4756]: E0318 14:18:53.609097 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert podName:ba85373b-8f2d-4f13-8ea7-0648b49074da nodeName:}" failed. No retries permitted until 2026-03-18 14:19:09.609082381 +0000 UTC m=+1150.923500356 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert") pod "infra-operator-controller-manager-7b9c774f96-9lwv9" (UID: "ba85373b-8f2d-4f13-8ea7-0648b49074da") : secret "infra-operator-webhook-server-cert" not found Mar 18 14:18:53 crc kubenswrapper[4756]: I0318 14:18:53.913410 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m846d\" (UID: \"d48d57d1-c314-4a7c-bd51-26ec5cfebbd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" Mar 18 14:18:53 crc kubenswrapper[4756]: I0318 14:18:53.922764 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d48d57d1-c314-4a7c-bd51-26ec5cfebbd1-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m846d\" (UID: \"d48d57d1-c314-4a7c-bd51-26ec5cfebbd1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" Mar 18 14:18:53 crc kubenswrapper[4756]: I0318 14:18:53.928841 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt" event={"ID":"6e2eb455-f2ff-40c0-9c26-67c675c1102f","Type":"ContainerStarted","Data":"d8e09c7ef2397321101b1d8090b8e0dd429eae4be7382e200178c23e99dcf25c"} Mar 18 14:18:53 crc kubenswrapper[4756]: I0318 14:18:53.929098 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt" Mar 18 14:18:53 crc kubenswrapper[4756]: I0318 14:18:53.945072 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt" podStartSLOduration=3.281353064 
podStartE2EDuration="16.945055197s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.638988771 +0000 UTC m=+1120.953406746" lastFinishedPulling="2026-03-18 14:18:53.302690874 +0000 UTC m=+1134.617108879" observedRunningTime="2026-03-18 14:18:53.944333078 +0000 UTC m=+1135.258751073" watchObservedRunningTime="2026-03-18 14:18:53.945055197 +0000 UTC m=+1135.259473172" Mar 18 14:18:54 crc kubenswrapper[4756]: I0318 14:18:54.133607 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" Mar 18 14:18:54 crc kubenswrapper[4756]: I0318 14:18:54.422935 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:54 crc kubenswrapper[4756]: I0318 14:18:54.423114 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:54 crc kubenswrapper[4756]: I0318 14:18:54.428158 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-metrics-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:54 crc 
kubenswrapper[4756]: I0318 14:18:54.428776 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d78c665d-9f25-4d44-80eb-12324454e435-webhook-certs\") pod \"openstack-operator-controller-manager-f84d7fd4f-tksnb\" (UID: \"d78c665d-9f25-4d44-80eb-12324454e435\") " pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:54 crc kubenswrapper[4756]: I0318 14:18:54.468896 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:54 crc kubenswrapper[4756]: I0318 14:18:54.943286 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g" event={"ID":"0a54e6d2-b6e2-4808-8c1d-e12b975702cb","Type":"ContainerStarted","Data":"f0ef6db45b031962bae610019a151467391069e3bbe1bc5fa872c5964f65f5cb"} Mar 18 14:18:54 crc kubenswrapper[4756]: I0318 14:18:54.943881 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g" Mar 18 14:18:54 crc kubenswrapper[4756]: I0318 14:18:54.961235 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g" podStartSLOduration=1.918485172 podStartE2EDuration="16.961193468s" podCreationTimestamp="2026-03-18 14:18:38 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.648261132 +0000 UTC m=+1120.962679107" lastFinishedPulling="2026-03-18 14:18:54.690969428 +0000 UTC m=+1136.005387403" observedRunningTime="2026-03-18 14:18:54.956855531 +0000 UTC m=+1136.271273516" watchObservedRunningTime="2026-03-18 14:18:54.961193468 +0000 UTC m=+1136.275611463" Mar 18 14:18:54 crc kubenswrapper[4756]: I0318 14:18:54.976698 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d"] Mar 18 14:18:54 crc kubenswrapper[4756]: W0318 14:18:54.980202 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd48d57d1_c314_4a7c_bd51_26ec5cfebbd1.slice/crio-c287311e9163a4f034b962d9e15e19399b89f15ae7ef9359b206a9db13c9df11 WatchSource:0}: Error finding container c287311e9163a4f034b962d9e15e19399b89f15ae7ef9359b206a9db13c9df11: Status 404 returned error can't find the container with id c287311e9163a4f034b962d9e15e19399b89f15ae7ef9359b206a9db13c9df11 Mar 18 14:18:55 crc kubenswrapper[4756]: I0318 14:18:55.128933 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb"] Mar 18 14:18:55 crc kubenswrapper[4756]: W0318 14:18:55.143666 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd78c665d_9f25_4d44_80eb_12324454e435.slice/crio-851942f9bff157548888c61dfa7a0db05660bf917b763b4a532c72f33c0083d5 WatchSource:0}: Error finding container 851942f9bff157548888c61dfa7a0db05660bf917b763b4a532c72f33c0083d5: Status 404 returned error can't find the container with id 851942f9bff157548888c61dfa7a0db05660bf917b763b4a532c72f33c0083d5 Mar 18 14:18:55 crc kubenswrapper[4756]: I0318 14:18:55.956154 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" event={"ID":"d48d57d1-c314-4a7c-bd51-26ec5cfebbd1","Type":"ContainerStarted","Data":"c287311e9163a4f034b962d9e15e19399b89f15ae7ef9359b206a9db13c9df11"} Mar 18 14:18:55 crc kubenswrapper[4756]: I0318 14:18:55.958490 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" 
event={"ID":"d78c665d-9f25-4d44-80eb-12324454e435","Type":"ContainerStarted","Data":"d85610ffbfc25748c4a670fa4ea0eed2f79fb7bed87f90971e93a3d3aa9e4982"} Mar 18 14:18:55 crc kubenswrapper[4756]: I0318 14:18:55.958554 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" event={"ID":"d78c665d-9f25-4d44-80eb-12324454e435","Type":"ContainerStarted","Data":"851942f9bff157548888c61dfa7a0db05660bf917b763b4a532c72f33c0083d5"} Mar 18 14:18:55 crc kubenswrapper[4756]: I0318 14:18:55.958742 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:18:56 crc kubenswrapper[4756]: I0318 14:18:55.995513 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" podStartSLOduration=17.99549378 podStartE2EDuration="17.99549378s" podCreationTimestamp="2026-03-18 14:18:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:18:55.988242193 +0000 UTC m=+1137.302660178" watchObservedRunningTime="2026-03-18 14:18:55.99549378 +0000 UTC m=+1137.309911755" Mar 18 14:18:56 crc kubenswrapper[4756]: I0318 14:18:56.967467 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7" event={"ID":"f5e9df6f-3f17-4a97-a335-332fb636f9dd","Type":"ContainerStarted","Data":"5416cdd41c949da19dafdc9c8cb2b0024c1cda080445bf960c22bcfb24e883a1"} Mar 18 14:18:56 crc kubenswrapper[4756]: I0318 14:18:56.967968 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7" Mar 18 14:18:56 crc kubenswrapper[4756]: I0318 14:18:56.986818 4756 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7" podStartSLOduration=3.4314749 podStartE2EDuration="19.98680171s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.639192226 +0000 UTC m=+1120.953610201" lastFinishedPulling="2026-03-18 14:18:56.194519036 +0000 UTC m=+1137.508937011" observedRunningTime="2026-03-18 14:18:56.980286534 +0000 UTC m=+1138.294704529" watchObservedRunningTime="2026-03-18 14:18:56.98680171 +0000 UTC m=+1138.301219685" Mar 18 14:18:57 crc kubenswrapper[4756]: I0318 14:18:57.927045 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-gmmtd" Mar 18 14:18:57 crc kubenswrapper[4756]: I0318 14:18:57.950026 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mskn7" Mar 18 14:18:57 crc kubenswrapper[4756]: I0318 14:18:57.988655 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mhqlh" Mar 18 14:18:58 crc kubenswrapper[4756]: I0318 14:18:58.001088 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-p6hhx" Mar 18 14:18:58 crc kubenswrapper[4756]: I0318 14:18:58.046443 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-k87mt" Mar 18 14:18:58 crc kubenswrapper[4756]: I0318 14:18:58.060914 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-vgbf2" Mar 18 14:18:58 crc kubenswrapper[4756]: I0318 14:18:58.106626 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/neutron-operator-controller-manager-767865f676-fk8pc" Mar 18 14:18:58 crc kubenswrapper[4756]: I0318 14:18:58.142676 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-ggbzf" Mar 18 14:18:58 crc kubenswrapper[4756]: I0318 14:18:58.148046 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wx92j" Mar 18 14:18:58 crc kubenswrapper[4756]: I0318 14:18:58.192900 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-br68j" Mar 18 14:18:58 crc kubenswrapper[4756]: I0318 14:18:58.194750 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fzxth" Mar 18 14:18:58 crc kubenswrapper[4756]: I0318 14:18:58.315749 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bz77b" Mar 18 14:18:58 crc kubenswrapper[4756]: I0318 14:18:58.453580 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-k6gpj" Mar 18 14:19:04 crc kubenswrapper[4756]: I0318 14:19:04.479134 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-f84d7fd4f-tksnb" Mar 18 14:19:06 crc kubenswrapper[4756]: I0318 14:19:06.915056 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:19:06 crc kubenswrapper[4756]: I0318 
14:19:06.915411 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:19:06 crc kubenswrapper[4756]: I0318 14:19:06.915458 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:19:06 crc kubenswrapper[4756]: I0318 14:19:06.916128 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cdecc63cb2f22e85e1b8370b385518ac960137ad6026c38922b0bcf6a6e1374a"} pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:19:06 crc kubenswrapper[4756]: I0318 14:19:06.916234 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" containerID="cri-o://cdecc63cb2f22e85e1b8370b385518ac960137ad6026c38922b0bcf6a6e1374a" gracePeriod=600 Mar 18 14:19:08 crc kubenswrapper[4756]: I0318 14:19:08.078269 4756 generic.go:334] "Generic (PLEG): container finished" podID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerID="cdecc63cb2f22e85e1b8370b385518ac960137ad6026c38922b0bcf6a6e1374a" exitCode=0 Mar 18 14:19:08 crc kubenswrapper[4756]: I0318 14:19:08.078320 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerDied","Data":"cdecc63cb2f22e85e1b8370b385518ac960137ad6026c38922b0bcf6a6e1374a"} Mar 18 14:19:08 crc 
kubenswrapper[4756]: I0318 14:19:08.078365 4756 scope.go:117] "RemoveContainer" containerID="617eebb4a8c3d04af231bb44e996daa1896f056ada27eee9b25a69c05455bb74" Mar 18 14:19:08 crc kubenswrapper[4756]: I0318 14:19:08.585840 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j9sw7" Mar 18 14:19:08 crc kubenswrapper[4756]: I0318 14:19:08.854461 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-phw4g" Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.086757 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn" event={"ID":"726ef03d-4a13-449c-8866-c6f5ee240873","Type":"ContainerStarted","Data":"0b73e9db6f71e2466da9627f681264aa5d5798bd73919a1111343d3e99a9316e"} Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.086919 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn" Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.088149 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26" event={"ID":"1187469f-b925-4823-8e9e-f3721c8b299b","Type":"ContainerStarted","Data":"f31f08f493f3d7344b9d2699c854558d9c3d16c235608131c55e599fb29ff00b"} Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.088415 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26" Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.089616 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" 
event={"ID":"d48d57d1-c314-4a7c-bd51-26ec5cfebbd1","Type":"ContainerStarted","Data":"a8e9cdd778200d2040a08fe92862cd4db48feaeb0a069cbfd5576be389571584"} Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.089740 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.091151 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sd7vp" event={"ID":"c37f78a1-6298-4f65-9dac-6f597ed75a31","Type":"ContainerStarted","Data":"a51fb3cda8de590e48fc72accfc3b56478e63c22b7372cc60886367c57187c99"} Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.093705 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"9727a0a3407fffdca1e788ab9dbb2c6a316b6b73611747d0e9dff150fec50fa4"} Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.095370 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5" event={"ID":"a632ee17-dd9e-4ec8-b281-7224395bd2fe","Type":"ContainerStarted","Data":"c1b57a7f54b83dc91a004be17070bc8a2044a58eff919258dce34d5306bc642f"} Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.095507 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5" Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.096536 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p" event={"ID":"d87d4d63-23ac-4366-88d2-5d8803a7322e","Type":"ContainerStarted","Data":"7b6cb5f04a7781c606614775f62266d4cbb3eb4a0b96a20ad1fd7f524cdbefda"} Mar 18 14:19:09 crc 
kubenswrapper[4756]: I0318 14:19:09.096672 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p" Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.112839 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn" podStartSLOduration=2.495768196 podStartE2EDuration="31.112820442s" podCreationTimestamp="2026-03-18 14:18:38 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.65931011 +0000 UTC m=+1120.973728085" lastFinishedPulling="2026-03-18 14:19:08.276362356 +0000 UTC m=+1149.590780331" observedRunningTime="2026-03-18 14:19:09.104513589 +0000 UTC m=+1150.418931564" watchObservedRunningTime="2026-03-18 14:19:09.112820442 +0000 UTC m=+1150.427238417" Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.133596 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sd7vp" podStartSLOduration=2.496555028 podStartE2EDuration="31.133581624s" podCreationTimestamp="2026-03-18 14:18:38 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.66710459 +0000 UTC m=+1120.981522565" lastFinishedPulling="2026-03-18 14:19:08.304131186 +0000 UTC m=+1149.618549161" observedRunningTime="2026-03-18 14:19:09.130618144 +0000 UTC m=+1150.445036119" watchObservedRunningTime="2026-03-18 14:19:09.133581624 +0000 UTC m=+1150.447999599" Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.141503 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5" podStartSLOduration=3.493588699 podStartE2EDuration="32.141484487s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.656988628 +0000 UTC m=+1120.971406603" lastFinishedPulling="2026-03-18 14:19:08.304884406 +0000 UTC 
m=+1149.619302391" observedRunningTime="2026-03-18 14:19:09.141243491 +0000 UTC m=+1150.455661476" watchObservedRunningTime="2026-03-18 14:19:09.141484487 +0000 UTC m=+1150.455902462" Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.152863 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p" podStartSLOduration=3.507158435 podStartE2EDuration="32.152848464s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.659182037 +0000 UTC m=+1120.973600012" lastFinishedPulling="2026-03-18 14:19:08.304872066 +0000 UTC m=+1149.619290041" observedRunningTime="2026-03-18 14:19:09.152182997 +0000 UTC m=+1150.466600972" watchObservedRunningTime="2026-03-18 14:19:09.152848464 +0000 UTC m=+1150.467266439" Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.174999 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" podStartSLOduration=18.870918294 podStartE2EDuration="32.174983592s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:18:54.982295878 +0000 UTC m=+1136.296713853" lastFinishedPulling="2026-03-18 14:19:08.286361176 +0000 UTC m=+1149.600779151" observedRunningTime="2026-03-18 14:19:09.172139656 +0000 UTC m=+1150.486557631" watchObservedRunningTime="2026-03-18 14:19:09.174983592 +0000 UTC m=+1150.489401557" Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.212193 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26" podStartSLOduration=2.612616504 podStartE2EDuration="31.212169847s" podCreationTimestamp="2026-03-18 14:18:38 +0000 UTC" firstStartedPulling="2026-03-18 14:18:39.629228718 +0000 UTC m=+1120.943646693" lastFinishedPulling="2026-03-18 14:19:08.228782061 +0000 UTC 
m=+1149.543200036" observedRunningTime="2026-03-18 14:19:09.205582149 +0000 UTC m=+1150.520000114" watchObservedRunningTime="2026-03-18 14:19:09.212169847 +0000 UTC m=+1150.526587892" Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.662721 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-9lwv9\" (UID: \"ba85373b-8f2d-4f13-8ea7-0648b49074da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.671406 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba85373b-8f2d-4f13-8ea7-0648b49074da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-9lwv9\" (UID: \"ba85373b-8f2d-4f13-8ea7-0648b49074da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.907200 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-4bfzt" Mar 18 14:19:09 crc kubenswrapper[4756]: I0318 14:19:09.913990 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" Mar 18 14:19:10 crc kubenswrapper[4756]: I0318 14:19:10.484888 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9"] Mar 18 14:19:11 crc kubenswrapper[4756]: I0318 14:19:11.109496 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" event={"ID":"ba85373b-8f2d-4f13-8ea7-0648b49074da","Type":"ContainerStarted","Data":"fd494cc1c71bdcd0e2db36aeb797c3b6c0054b98facf3bf49fafbca7bdf95879"} Mar 18 14:19:13 crc kubenswrapper[4756]: I0318 14:19:13.124854 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" event={"ID":"ba85373b-8f2d-4f13-8ea7-0648b49074da","Type":"ContainerStarted","Data":"23033d234ba9e3b3cbd599e3665abfc14197856c3a07a577b36ea171a2b0fe6b"} Mar 18 14:19:13 crc kubenswrapper[4756]: I0318 14:19:13.125511 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" Mar 18 14:19:13 crc kubenswrapper[4756]: I0318 14:19:13.147048 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" podStartSLOduration=34.431420659 podStartE2EDuration="36.147027656s" podCreationTimestamp="2026-03-18 14:18:37 +0000 UTC" firstStartedPulling="2026-03-18 14:19:10.487794237 +0000 UTC m=+1151.802212212" lastFinishedPulling="2026-03-18 14:19:12.203401234 +0000 UTC m=+1153.517819209" observedRunningTime="2026-03-18 14:19:13.143765918 +0000 UTC m=+1154.458183913" watchObservedRunningTime="2026-03-18 14:19:13.147027656 +0000 UTC m=+1154.461445631" Mar 18 14:19:14 crc kubenswrapper[4756]: I0318 14:19:14.141090 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m846d" Mar 18 14:19:18 crc kubenswrapper[4756]: I0318 14:19:18.343214 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwrx5" Mar 18 14:19:18 crc kubenswrapper[4756]: I0318 14:19:18.769799 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-4pz7p" Mar 18 14:19:18 crc kubenswrapper[4756]: I0318 14:19:18.810010 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5b79d7bc79-kn5zn" Mar 18 14:19:18 crc kubenswrapper[4756]: I0318 14:19:18.819488 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-wsh26" Mar 18 14:19:19 crc kubenswrapper[4756]: I0318 14:19:19.920218 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-9lwv9" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.223917 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6wghm"] Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.225507 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6wghm" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.230637 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.230727 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.231011 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.232814 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-8b957" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.245393 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6wghm"] Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.289757 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fvc2v"] Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.290924 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.296199 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.301636 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fvc2v"] Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.338682 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7533a2b3-c9f8-4594-ba41-f200d085e0de-config\") pod \"dnsmasq-dns-675f4bcbfc-6wghm\" (UID: \"7533a2b3-c9f8-4594-ba41-f200d085e0de\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6wghm" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.338746 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twl8f\" (UniqueName: \"kubernetes.io/projected/7533a2b3-c9f8-4594-ba41-f200d085e0de-kube-api-access-twl8f\") pod \"dnsmasq-dns-675f4bcbfc-6wghm\" (UID: \"7533a2b3-c9f8-4594-ba41-f200d085e0de\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6wghm" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.440247 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plltk\" (UniqueName: \"kubernetes.io/projected/b8398a92-1446-4d7f-b776-ba94565d42ee-kube-api-access-plltk\") pod \"dnsmasq-dns-78dd6ddcc-fvc2v\" (UID: \"b8398a92-1446-4d7f-b776-ba94565d42ee\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.440330 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8398a92-1446-4d7f-b776-ba94565d42ee-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fvc2v\" (UID: \"b8398a92-1446-4d7f-b776-ba94565d42ee\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.440357 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8398a92-1446-4d7f-b776-ba94565d42ee-config\") pod \"dnsmasq-dns-78dd6ddcc-fvc2v\" (UID: \"b8398a92-1446-4d7f-b776-ba94565d42ee\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.440392 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7533a2b3-c9f8-4594-ba41-f200d085e0de-config\") pod \"dnsmasq-dns-675f4bcbfc-6wghm\" (UID: \"7533a2b3-c9f8-4594-ba41-f200d085e0de\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6wghm" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.440442 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twl8f\" (UniqueName: \"kubernetes.io/projected/7533a2b3-c9f8-4594-ba41-f200d085e0de-kube-api-access-twl8f\") pod \"dnsmasq-dns-675f4bcbfc-6wghm\" (UID: \"7533a2b3-c9f8-4594-ba41-f200d085e0de\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6wghm" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.442293 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7533a2b3-c9f8-4594-ba41-f200d085e0de-config\") pod \"dnsmasq-dns-675f4bcbfc-6wghm\" (UID: \"7533a2b3-c9f8-4594-ba41-f200d085e0de\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6wghm" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.461279 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twl8f\" (UniqueName: \"kubernetes.io/projected/7533a2b3-c9f8-4594-ba41-f200d085e0de-kube-api-access-twl8f\") pod \"dnsmasq-dns-675f4bcbfc-6wghm\" (UID: \"7533a2b3-c9f8-4594-ba41-f200d085e0de\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6wghm" Mar 18 14:19:40 crc 
kubenswrapper[4756]: I0318 14:19:40.542093 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8398a92-1446-4d7f-b776-ba94565d42ee-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fvc2v\" (UID: \"b8398a92-1446-4d7f-b776-ba94565d42ee\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.542480 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8398a92-1446-4d7f-b776-ba94565d42ee-config\") pod \"dnsmasq-dns-78dd6ddcc-fvc2v\" (UID: \"b8398a92-1446-4d7f-b776-ba94565d42ee\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.542598 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plltk\" (UniqueName: \"kubernetes.io/projected/b8398a92-1446-4d7f-b776-ba94565d42ee-kube-api-access-plltk\") pod \"dnsmasq-dns-78dd6ddcc-fvc2v\" (UID: \"b8398a92-1446-4d7f-b776-ba94565d42ee\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.543022 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8398a92-1446-4d7f-b776-ba94565d42ee-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fvc2v\" (UID: \"b8398a92-1446-4d7f-b776-ba94565d42ee\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.543850 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8398a92-1446-4d7f-b776-ba94565d42ee-config\") pod \"dnsmasq-dns-78dd6ddcc-fvc2v\" (UID: \"b8398a92-1446-4d7f-b776-ba94565d42ee\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.547025 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6wghm" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.574710 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plltk\" (UniqueName: \"kubernetes.io/projected/b8398a92-1446-4d7f-b776-ba94565d42ee-kube-api-access-plltk\") pod \"dnsmasq-dns-78dd6ddcc-fvc2v\" (UID: \"b8398a92-1446-4d7f-b776-ba94565d42ee\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.624426 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.840141 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6wghm"] Mar 18 14:19:40 crc kubenswrapper[4756]: W0318 14:19:40.848220 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7533a2b3_c9f8_4594_ba41_f200d085e0de.slice/crio-70101f2c9010710b60165b191dac1185721908527682c9e8b45bd8d1690d94d0 WatchSource:0}: Error finding container 70101f2c9010710b60165b191dac1185721908527682c9e8b45bd8d1690d94d0: Status 404 returned error can't find the container with id 70101f2c9010710b60165b191dac1185721908527682c9e8b45bd8d1690d94d0 Mar 18 14:19:40 crc kubenswrapper[4756]: I0318 14:19:40.923761 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fvc2v"] Mar 18 14:19:40 crc kubenswrapper[4756]: W0318 14:19:40.927655 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8398a92_1446_4d7f_b776_ba94565d42ee.slice/crio-fd7e0115ecac29f041ca2684ddf61322042bac96d2cd4c5256448b76fd011d99 WatchSource:0}: Error finding container fd7e0115ecac29f041ca2684ddf61322042bac96d2cd4c5256448b76fd011d99: Status 404 returned error can't find the container with id 
fd7e0115ecac29f041ca2684ddf61322042bac96d2cd4c5256448b76fd011d99 Mar 18 14:19:41 crc kubenswrapper[4756]: I0318 14:19:41.374205 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" event={"ID":"b8398a92-1446-4d7f-b776-ba94565d42ee","Type":"ContainerStarted","Data":"fd7e0115ecac29f041ca2684ddf61322042bac96d2cd4c5256448b76fd011d99"} Mar 18 14:19:41 crc kubenswrapper[4756]: I0318 14:19:41.377044 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-6wghm" event={"ID":"7533a2b3-c9f8-4594-ba41-f200d085e0de","Type":"ContainerStarted","Data":"70101f2c9010710b60165b191dac1185721908527682c9e8b45bd8d1690d94d0"} Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.140565 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6wghm"] Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.168615 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-b7mcw"] Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.171308 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.176401 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-b7mcw"] Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.290176 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8994855c-7079-4927-ac5d-1d875288634d-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-b7mcw\" (UID: \"8994855c-7079-4927-ac5d-1d875288634d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.290298 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzz7v\" (UniqueName: \"kubernetes.io/projected/8994855c-7079-4927-ac5d-1d875288634d-kube-api-access-nzz7v\") pod \"dnsmasq-dns-5ccc8479f9-b7mcw\" (UID: \"8994855c-7079-4927-ac5d-1d875288634d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.290388 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8994855c-7079-4927-ac5d-1d875288634d-config\") pod \"dnsmasq-dns-5ccc8479f9-b7mcw\" (UID: \"8994855c-7079-4927-ac5d-1d875288634d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.392737 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8994855c-7079-4927-ac5d-1d875288634d-config\") pod \"dnsmasq-dns-5ccc8479f9-b7mcw\" (UID: \"8994855c-7079-4927-ac5d-1d875288634d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.392911 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8994855c-7079-4927-ac5d-1d875288634d-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-b7mcw\" (UID: \"8994855c-7079-4927-ac5d-1d875288634d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.394458 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8994855c-7079-4927-ac5d-1d875288634d-config\") pod \"dnsmasq-dns-5ccc8479f9-b7mcw\" (UID: \"8994855c-7079-4927-ac5d-1d875288634d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.400792 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8994855c-7079-4927-ac5d-1d875288634d-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-b7mcw\" (UID: \"8994855c-7079-4927-ac5d-1d875288634d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.400971 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzz7v\" (UniqueName: \"kubernetes.io/projected/8994855c-7079-4927-ac5d-1d875288634d-kube-api-access-nzz7v\") pod \"dnsmasq-dns-5ccc8479f9-b7mcw\" (UID: \"8994855c-7079-4927-ac5d-1d875288634d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.421265 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzz7v\" (UniqueName: \"kubernetes.io/projected/8994855c-7079-4927-ac5d-1d875288634d-kube-api-access-nzz7v\") pod \"dnsmasq-dns-5ccc8479f9-b7mcw\" (UID: \"8994855c-7079-4927-ac5d-1d875288634d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.427827 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fvc2v"] Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.449195 4756 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-d59qb"] Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.451025 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.478824 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-d59qb"] Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.514150 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.609608 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lrqx\" (UniqueName: \"kubernetes.io/projected/7c25b083-0273-4f26-8fb7-ca7e78907b8c-kube-api-access-4lrqx\") pod \"dnsmasq-dns-57d769cc4f-d59qb\" (UID: \"7c25b083-0273-4f26-8fb7-ca7e78907b8c\") " pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.609863 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c25b083-0273-4f26-8fb7-ca7e78907b8c-config\") pod \"dnsmasq-dns-57d769cc4f-d59qb\" (UID: \"7c25b083-0273-4f26-8fb7-ca7e78907b8c\") " pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.609959 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c25b083-0273-4f26-8fb7-ca7e78907b8c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-d59qb\" (UID: \"7c25b083-0273-4f26-8fb7-ca7e78907b8c\") " pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.713142 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/7c25b083-0273-4f26-8fb7-ca7e78907b8c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-d59qb\" (UID: \"7c25b083-0273-4f26-8fb7-ca7e78907b8c\") " pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.713233 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lrqx\" (UniqueName: \"kubernetes.io/projected/7c25b083-0273-4f26-8fb7-ca7e78907b8c-kube-api-access-4lrqx\") pod \"dnsmasq-dns-57d769cc4f-d59qb\" (UID: \"7c25b083-0273-4f26-8fb7-ca7e78907b8c\") " pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.713262 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c25b083-0273-4f26-8fb7-ca7e78907b8c-config\") pod \"dnsmasq-dns-57d769cc4f-d59qb\" (UID: \"7c25b083-0273-4f26-8fb7-ca7e78907b8c\") " pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.714398 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c25b083-0273-4f26-8fb7-ca7e78907b8c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-d59qb\" (UID: \"7c25b083-0273-4f26-8fb7-ca7e78907b8c\") " pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.714485 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c25b083-0273-4f26-8fb7-ca7e78907b8c-config\") pod \"dnsmasq-dns-57d769cc4f-d59qb\" (UID: \"7c25b083-0273-4f26-8fb7-ca7e78907b8c\") " pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.738318 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lrqx\" (UniqueName: \"kubernetes.io/projected/7c25b083-0273-4f26-8fb7-ca7e78907b8c-kube-api-access-4lrqx\") pod 
\"dnsmasq-dns-57d769cc4f-d59qb\" (UID: \"7c25b083-0273-4f26-8fb7-ca7e78907b8c\") " pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.792147 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" Mar 18 14:19:43 crc kubenswrapper[4756]: I0318 14:19:43.916795 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-b7mcw"] Mar 18 14:19:43 crc kubenswrapper[4756]: W0318 14:19:43.920912 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8994855c_7079_4927_ac5d_1d875288634d.slice/crio-465223d3c738fa6a4069898fb3a2802d0714b764ea033f9168e29d9d6b4764bf WatchSource:0}: Error finding container 465223d3c738fa6a4069898fb3a2802d0714b764ea033f9168e29d9d6b4764bf: Status 404 returned error can't find the container with id 465223d3c738fa6a4069898fb3a2802d0714b764ea033f9168e29d9d6b4764bf Mar 18 14:19:44 crc kubenswrapper[4756]: W0318 14:19:44.218155 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c25b083_0273_4f26_8fb7_ca7e78907b8c.slice/crio-4d0cf8e5e185af36f8a31d1ac75f63ea2c6c93bcb811b50655b03d1d80720560 WatchSource:0}: Error finding container 4d0cf8e5e185af36f8a31d1ac75f63ea2c6c93bcb811b50655b03d1d80720560: Status 404 returned error can't find the container with id 4d0cf8e5e185af36f8a31d1ac75f63ea2c6c93bcb811b50655b03d1d80720560 Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.219296 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-d59qb"] Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.317107 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.318619 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.326691 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.326769 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.326927 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.326978 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.327166 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.327553 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.327861 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6nmfp" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.334870 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.431300 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" event={"ID":"8994855c-7079-4927-ac5d-1d875288634d","Type":"ContainerStarted","Data":"465223d3c738fa6a4069898fb3a2802d0714b764ea033f9168e29d9d6b4764bf"} Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.431946 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/228ca85e-a493-4dc4-9b95-5148c92ba228-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.431999 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.432029 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-585hj\" (UniqueName: \"kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-kube-api-access-585hj\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.432055 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.432083 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.432189 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.432214 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.432235 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.432257 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/228ca85e-a493-4dc4-9b95-5148c92ba228-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.432290 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.432488 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.433755 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" event={"ID":"7c25b083-0273-4f26-8fb7-ca7e78907b8c","Type":"ContainerStarted","Data":"4d0cf8e5e185af36f8a31d1ac75f63ea2c6c93bcb811b50655b03d1d80720560"} Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.534159 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.534215 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.534240 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.534262 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/228ca85e-a493-4dc4-9b95-5148c92ba228-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.534297 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.534370 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.534430 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/228ca85e-a493-4dc4-9b95-5148c92ba228-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.534467 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.534523 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-585hj\" (UniqueName: \"kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-kube-api-access-585hj\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.534547 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.534600 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.534867 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.535103 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.535304 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc 
kubenswrapper[4756]: I0318 14:19:44.535842 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.536733 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.538847 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.538889 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/74db1f4c9ab2a4ba86b05e72f95f0efbae4c6a7d3cd1801a16cdc636a751481a/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.545531 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/228ca85e-a493-4dc4-9b95-5148c92ba228-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.545552 4756 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.549606 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.553800 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-585hj\" (UniqueName: \"kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-kube-api-access-585hj\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.581677 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/228ca85e-a493-4dc4-9b95-5148c92ba228-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.606312 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\") pod \"rabbitmq-cell1-server-0\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.608950 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.610175 
4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.614581 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.617492 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.617729 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.617837 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.618009 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.618606 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.618705 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-stdr5" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.630619 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.647470 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.742843 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.742888 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-config-data\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.742909 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.742947 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.742961 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " 
pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.742990 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.743032 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.743055 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.743071 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.743104 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc68v\" (UniqueName: \"kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-kube-api-access-xc68v\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " 
pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.743135 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.844922 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.844978 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.844999 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.845033 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc68v\" (UniqueName: \"kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-kube-api-access-xc68v\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.845063 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.845082 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.845099 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-config-data\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.845136 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.845174 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.845189 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.845219 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.845787 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.847055 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.847503 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.847594 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.847855 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.850596 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.860824 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.863483 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.863930 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.870509 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xc68v\" (UniqueName: \"kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-kube-api-access-xc68v\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.872686 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.872726 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/38f97d1707111d96e116f25286c985a7c009ee85bc99aa932c406463acfc1268/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.928291 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\") pod \"rabbitmq-server-0\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " pod="openstack/rabbitmq-server-0" Mar 18 14:19:44 crc kubenswrapper[4756]: I0318 14:19:44.965153 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.870625 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.873732 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.880466 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-t92rl" Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.880589 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.882432 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.883304 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.885471 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.891921 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.976433 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.976503 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b375193f-5cd6-4396-8230-04033cd32a8e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b375193f-5cd6-4396-8230-04033cd32a8e\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.976543 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.976588 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg9px\" (UniqueName: \"kubernetes.io/projected/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-kube-api-access-sg9px\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.976631 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.976791 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-config-data-default\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.976973 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:45 crc kubenswrapper[4756]: I0318 14:19:45.977137 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-kolla-config\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.079288 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-config-data-default\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.079348 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.079375 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-kolla-config\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.079398 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.079427 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b375193f-5cd6-4396-8230-04033cd32a8e\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b375193f-5cd6-4396-8230-04033cd32a8e\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.079452 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.079484 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg9px\" (UniqueName: \"kubernetes.io/projected/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-kube-api-access-sg9px\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.079537 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.079907 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.080701 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-config-data-default\") pod 
\"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.081819 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-kolla-config\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.082775 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.085325 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.085392 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b375193f-5cd6-4396-8230-04033cd32a8e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b375193f-5cd6-4396-8230-04033cd32a8e\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/46132203f49c394438e4c72b9520b1fba15ad92354f6ba4370b9e16d21a6f8a7/globalmount\"" pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.086156 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.096218 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.114778 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg9px\" (UniqueName: \"kubernetes.io/projected/7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe-kube-api-access-sg9px\") pod \"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.130707 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b375193f-5cd6-4396-8230-04033cd32a8e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b375193f-5cd6-4396-8230-04033cd32a8e\") pod 
\"openstack-galera-0\" (UID: \"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe\") " pod="openstack/openstack-galera-0" Mar 18 14:19:46 crc kubenswrapper[4756]: I0318 14:19:46.218510 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.106825 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.108174 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.110815 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-n4cgd" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.111657 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.111843 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.112093 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.130387 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.197969 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8c6500d1-adb1-4f70-af01-1b8ae2351293\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c6500d1-adb1-4f70-af01-1b8ae2351293\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.198019 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.198041 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsrwx\" (UniqueName: \"kubernetes.io/projected/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-kube-api-access-wsrwx\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.198154 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.198295 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.198420 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc 
kubenswrapper[4756]: I0318 14:19:47.198500 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.198595 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.299978 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.300066 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.300106 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 
14:19:47.300166 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.300234 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsrwx\" (UniqueName: \"kubernetes.io/projected/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-kube-api-access-wsrwx\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.300257 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8c6500d1-adb1-4f70-af01-1b8ae2351293\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c6500d1-adb1-4f70-af01-1b8ae2351293\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.300277 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.300344 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.301226 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.301968 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.302004 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.302013 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.306030 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.306527 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.307295 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.307324 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8c6500d1-adb1-4f70-af01-1b8ae2351293\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c6500d1-adb1-4f70-af01-1b8ae2351293\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/de9e9625eaf790b6ae471020d97cf531bd2520ac444cdcefb51447c5502488d9/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.323419 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsrwx\" (UniqueName: \"kubernetes.io/projected/b5d3bcfe-0ae1-4104-8433-ffb4569a29d8-kube-api-access-wsrwx\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.340472 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8c6500d1-adb1-4f70-af01-1b8ae2351293\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c6500d1-adb1-4f70-af01-1b8ae2351293\") pod \"openstack-cell1-galera-0\" (UID: \"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8\") " pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.437001 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.447195 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.449184 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.452335 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-jvlx8" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.452385 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.452438 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.465818 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.505421 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6zpz\" (UniqueName: \"kubernetes.io/projected/18faa7ad-0836-473a-aebe-0a6e5357b554-kube-api-access-k6zpz\") pod \"memcached-0\" (UID: \"18faa7ad-0836-473a-aebe-0a6e5357b554\") " pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.505588 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18faa7ad-0836-473a-aebe-0a6e5357b554-config-data\") pod \"memcached-0\" (UID: \"18faa7ad-0836-473a-aebe-0a6e5357b554\") " pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.505650 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/18faa7ad-0836-473a-aebe-0a6e5357b554-kolla-config\") pod \"memcached-0\" (UID: \"18faa7ad-0836-473a-aebe-0a6e5357b554\") " pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.505719 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18faa7ad-0836-473a-aebe-0a6e5357b554-combined-ca-bundle\") pod \"memcached-0\" (UID: \"18faa7ad-0836-473a-aebe-0a6e5357b554\") " pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.505813 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/18faa7ad-0836-473a-aebe-0a6e5357b554-memcached-tls-certs\") pod \"memcached-0\" (UID: \"18faa7ad-0836-473a-aebe-0a6e5357b554\") " pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.607863 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6zpz\" (UniqueName: \"kubernetes.io/projected/18faa7ad-0836-473a-aebe-0a6e5357b554-kube-api-access-k6zpz\") pod \"memcached-0\" (UID: \"18faa7ad-0836-473a-aebe-0a6e5357b554\") " pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.607943 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18faa7ad-0836-473a-aebe-0a6e5357b554-config-data\") pod \"memcached-0\" (UID: \"18faa7ad-0836-473a-aebe-0a6e5357b554\") " pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.607975 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/18faa7ad-0836-473a-aebe-0a6e5357b554-kolla-config\") pod \"memcached-0\" (UID: \"18faa7ad-0836-473a-aebe-0a6e5357b554\") " 
pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.608011 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18faa7ad-0836-473a-aebe-0a6e5357b554-combined-ca-bundle\") pod \"memcached-0\" (UID: \"18faa7ad-0836-473a-aebe-0a6e5357b554\") " pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.608045 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/18faa7ad-0836-473a-aebe-0a6e5357b554-memcached-tls-certs\") pod \"memcached-0\" (UID: \"18faa7ad-0836-473a-aebe-0a6e5357b554\") " pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.608931 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/18faa7ad-0836-473a-aebe-0a6e5357b554-kolla-config\") pod \"memcached-0\" (UID: \"18faa7ad-0836-473a-aebe-0a6e5357b554\") " pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.610144 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18faa7ad-0836-473a-aebe-0a6e5357b554-config-data\") pod \"memcached-0\" (UID: \"18faa7ad-0836-473a-aebe-0a6e5357b554\") " pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.612512 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/18faa7ad-0836-473a-aebe-0a6e5357b554-memcached-tls-certs\") pod \"memcached-0\" (UID: \"18faa7ad-0836-473a-aebe-0a6e5357b554\") " pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.614543 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/18faa7ad-0836-473a-aebe-0a6e5357b554-combined-ca-bundle\") pod \"memcached-0\" (UID: \"18faa7ad-0836-473a-aebe-0a6e5357b554\") " pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.625984 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6zpz\" (UniqueName: \"kubernetes.io/projected/18faa7ad-0836-473a-aebe-0a6e5357b554-kube-api-access-k6zpz\") pod \"memcached-0\" (UID: \"18faa7ad-0836-473a-aebe-0a6e5357b554\") " pod="openstack/memcached-0" Mar 18 14:19:47 crc kubenswrapper[4756]: I0318 14:19:47.775617 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 14:19:49 crc kubenswrapper[4756]: I0318 14:19:49.629552 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 14:19:49 crc kubenswrapper[4756]: I0318 14:19:49.630695 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 14:19:49 crc kubenswrapper[4756]: I0318 14:19:49.634142 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-52gj6" Mar 18 14:19:49 crc kubenswrapper[4756]: I0318 14:19:49.647876 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 14:19:49 crc kubenswrapper[4756]: I0318 14:19:49.741164 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46x77\" (UniqueName: \"kubernetes.io/projected/67dc4771-f106-4f33-9f84-0d7251e4259d-kube-api-access-46x77\") pod \"kube-state-metrics-0\" (UID: \"67dc4771-f106-4f33-9f84-0d7251e4259d\") " pod="openstack/kube-state-metrics-0" Mar 18 14:19:49 crc kubenswrapper[4756]: I0318 14:19:49.842179 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46x77\" (UniqueName: 
\"kubernetes.io/projected/67dc4771-f106-4f33-9f84-0d7251e4259d-kube-api-access-46x77\") pod \"kube-state-metrics-0\" (UID: \"67dc4771-f106-4f33-9f84-0d7251e4259d\") " pod="openstack/kube-state-metrics-0" Mar 18 14:19:49 crc kubenswrapper[4756]: I0318 14:19:49.879159 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46x77\" (UniqueName: \"kubernetes.io/projected/67dc4771-f106-4f33-9f84-0d7251e4259d-kube-api-access-46x77\") pod \"kube-state-metrics-0\" (UID: \"67dc4771-f106-4f33-9f84-0d7251e4259d\") " pod="openstack/kube-state-metrics-0" Mar 18 14:19:49 crc kubenswrapper[4756]: I0318 14:19:49.947978 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.056040 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.062313 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.071572 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.071829 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.071920 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.071954 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.072212 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-5pdd8" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.097711 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.145969 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/529ae791-8631-4ca5-9e4b-bb857d6264a8-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.146025 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/529ae791-8631-4ca5-9e4b-bb857d6264a8-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " 
pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.146057 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/529ae791-8631-4ca5-9e4b-bb857d6264a8-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.146078 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/529ae791-8631-4ca5-9e4b-bb857d6264a8-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.146100 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/529ae791-8631-4ca5-9e4b-bb857d6264a8-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.146134 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pgtw\" (UniqueName: \"kubernetes.io/projected/529ae791-8631-4ca5-9e4b-bb857d6264a8-kube-api-access-5pgtw\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.146158 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/529ae791-8631-4ca5-9e4b-bb857d6264a8-config-volume\") pod \"alertmanager-metric-storage-0\" 
(UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.249884 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/529ae791-8631-4ca5-9e4b-bb857d6264a8-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.249937 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/529ae791-8631-4ca5-9e4b-bb857d6264a8-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.249964 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/529ae791-8631-4ca5-9e4b-bb857d6264a8-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.249985 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/529ae791-8631-4ca5-9e4b-bb857d6264a8-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.250008 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/529ae791-8631-4ca5-9e4b-bb857d6264a8-web-config\") pod \"alertmanager-metric-storage-0\" (UID: 
\"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.250024 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pgtw\" (UniqueName: \"kubernetes.io/projected/529ae791-8631-4ca5-9e4b-bb857d6264a8-kube-api-access-5pgtw\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.250045 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/529ae791-8631-4ca5-9e4b-bb857d6264a8-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.253348 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/529ae791-8631-4ca5-9e4b-bb857d6264a8-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.253695 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/529ae791-8631-4ca5-9e4b-bb857d6264a8-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.260063 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/529ae791-8631-4ca5-9e4b-bb857d6264a8-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: 
\"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.260685 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/529ae791-8631-4ca5-9e4b-bb857d6264a8-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.265510 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/529ae791-8631-4ca5-9e4b-bb857d6264a8-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.271626 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/529ae791-8631-4ca5-9e4b-bb857d6264a8-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.282015 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pgtw\" (UniqueName: \"kubernetes.io/projected/529ae791-8631-4ca5-9e4b-bb857d6264a8-kube-api-access-5pgtw\") pod \"alertmanager-metric-storage-0\" (UID: \"529ae791-8631-4ca5-9e4b-bb857d6264a8\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:50 crc kubenswrapper[4756]: I0318 14:19:50.403011 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.066614 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.068757 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.073877 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.075186 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.075217 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.075361 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.075370 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kdkt7" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.078231 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.079330 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.079547 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.085447 4756 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.163007 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-config\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.163068 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.163095 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.163133 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.163196 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/8d691251-328b-4c09-98d1-4b968ab5bc05-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.163336 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fdc3ac31-1c36-4851-94d7-354178e93595\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdc3ac31-1c36-4851-94d7-354178e93595\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.163379 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzf7r\" (UniqueName: \"kubernetes.io/projected/8d691251-328b-4c09-98d1-4b968ab5bc05-kube-api-access-kzf7r\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.163426 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8d691251-328b-4c09-98d1-4b968ab5bc05-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.163472 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.163510 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.265702 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-config\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.265758 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.265791 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.265812 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " 
pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.265865 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8d691251-328b-4c09-98d1-4b968ab5bc05-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.265910 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fdc3ac31-1c36-4851-94d7-354178e93595\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdc3ac31-1c36-4851-94d7-354178e93595\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.265931 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzf7r\" (UniqueName: \"kubernetes.io/projected/8d691251-328b-4c09-98d1-4b968ab5bc05-kube-api-access-kzf7r\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.265960 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8d691251-328b-4c09-98d1-4b968ab5bc05-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.265991 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " 
pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.266017 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.266675 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.266957 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.267653 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.269753 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.269939 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fdc3ac31-1c36-4851-94d7-354178e93595\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdc3ac31-1c36-4851-94d7-354178e93595\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/810f25a37f71de248104173ceeca717706568db9cb74ba5fdae93f590561981a/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.270900 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.271269 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8d691251-328b-4c09-98d1-4b968ab5bc05-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.271444 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.271941 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8d691251-328b-4c09-98d1-4b968ab5bc05-tls-assets\") 
pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.275050 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-config\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.289668 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzf7r\" (UniqueName: \"kubernetes.io/projected/8d691251-328b-4c09-98d1-4b968ab5bc05-kube-api-access-kzf7r\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.307664 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fdc3ac31-1c36-4851-94d7-354178e93595\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdc3ac31-1c36-4851-94d7-354178e93595\") pod \"prometheus-metric-storage-0\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:51 crc kubenswrapper[4756]: I0318 14:19:51.391139 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.693154 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r6s8c"] Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.694775 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.697407 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-hktpz" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.697555 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.697660 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.701896 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-vsj9n"] Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.704600 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.729253 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vsj9n"] Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.736971 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r6s8c"] Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.805131 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3f5c347c-244f-40b6-8311-8eac0e22626a-var-log\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.805176 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-ovn-controller-tls-certs\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " 
pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.805212 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-scripts\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.805235 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3f5c347c-244f-40b6-8311-8eac0e22626a-var-lib\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.805265 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-var-run\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.805280 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-var-run-ovn\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.805307 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f5c347c-244f-40b6-8311-8eac0e22626a-scripts\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 
14:19:53.805325 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll9mf\" (UniqueName: \"kubernetes.io/projected/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-kube-api-access-ll9mf\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.805377 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwtrs\" (UniqueName: \"kubernetes.io/projected/3f5c347c-244f-40b6-8311-8eac0e22626a-kube-api-access-mwtrs\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.805400 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3f5c347c-244f-40b6-8311-8eac0e22626a-var-run\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.805424 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-combined-ca-bundle\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.805446 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3f5c347c-244f-40b6-8311-8eac0e22626a-etc-ovs\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 
14:19:53.805468 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-var-log-ovn\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.821319 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.823412 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.826040 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.829779 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.830055 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.830485 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.831084 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-lh45f" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.834516 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.906737 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-var-log-ovn\") pod \"ovn-controller-r6s8c\" (UID: 
\"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.906778 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.906806 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3f5c347c-244f-40b6-8311-8eac0e22626a-var-log\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.906830 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.907289 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-var-log-ovn\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.907342 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3f5c347c-244f-40b6-8311-8eac0e22626a-var-log\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.907407 
4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-ovn-controller-tls-certs\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.907429 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-config\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908018 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-scripts\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908055 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3f5c347c-244f-40b6-8311-8eac0e22626a-var-lib\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908100 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0ca5b05d-84e2-4ebd-92c5-9a8d0bf1ccbf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ca5b05d-84e2-4ebd-92c5-9a8d0bf1ccbf\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908151 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-var-run\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908178 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-var-run-ovn\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908212 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908251 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f5c347c-244f-40b6-8311-8eac0e22626a-scripts\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908282 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll9mf\" (UniqueName: \"kubernetes.io/projected/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-kube-api-access-ll9mf\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908314 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-ovsdbserver-nb-tls-certs\") 
pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908341 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908365 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwtrs\" (UniqueName: \"kubernetes.io/projected/3f5c347c-244f-40b6-8311-8eac0e22626a-kube-api-access-mwtrs\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908383 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn2vz\" (UniqueName: \"kubernetes.io/projected/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-kube-api-access-pn2vz\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908403 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3f5c347c-244f-40b6-8311-8eac0e22626a-var-run\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908432 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-combined-ca-bundle\") pod \"ovn-controller-r6s8c\" (UID: 
\"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908440 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-var-run\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908459 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3f5c347c-244f-40b6-8311-8eac0e22626a-etc-ovs\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908490 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-var-run-ovn\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908504 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3f5c347c-244f-40b6-8311-8eac0e22626a-var-run\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908587 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3f5c347c-244f-40b6-8311-8eac0e22626a-var-lib\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.908679 4756 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3f5c347c-244f-40b6-8311-8eac0e22626a-etc-ovs\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.910138 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f5c347c-244f-40b6-8311-8eac0e22626a-scripts\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.913440 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-scripts\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.915836 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-combined-ca-bundle\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.916140 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-ovn-controller-tls-certs\") pod \"ovn-controller-r6s8c\" (UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.923165 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll9mf\" (UniqueName: \"kubernetes.io/projected/99dfb896-59f3-4f93-8d0e-4b19b49cbc56-kube-api-access-ll9mf\") pod \"ovn-controller-r6s8c\" 
(UID: \"99dfb896-59f3-4f93-8d0e-4b19b49cbc56\") " pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:53 crc kubenswrapper[4756]: I0318 14:19:53.923706 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwtrs\" (UniqueName: \"kubernetes.io/projected/3f5c347c-244f-40b6-8311-8eac0e22626a-kube-api-access-mwtrs\") pod \"ovn-controller-ovs-vsj9n\" (UID: \"3f5c347c-244f-40b6-8311-8eac0e22626a\") " pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.010202 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.010251 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.010274 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-config\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.010313 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0ca5b05d-84e2-4ebd-92c5-9a8d0bf1ccbf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ca5b05d-84e2-4ebd-92c5-9a8d0bf1ccbf\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc 
kubenswrapper[4756]: I0318 14:19:54.010346 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.010385 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.010408 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.010431 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn2vz\" (UniqueName: \"kubernetes.io/projected/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-kube-api-access-pn2vz\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.010752 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.011277 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-config\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.011571 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.014036 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.017583 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.018026 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.018062 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0ca5b05d-84e2-4ebd-92c5-9a8d0bf1ccbf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ca5b05d-84e2-4ebd-92c5-9a8d0bf1ccbf\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/56b26495b120ed538ba2bdfcceda7a12acbac01877dcd0829fdc501771a1c0b5/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.023779 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.025217 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn2vz\" (UniqueName: \"kubernetes.io/projected/ba7aaad4-94c3-4202-a512-a84cba9bcb9f-kube-api-access-pn2vz\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.030988 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r6s8c" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.038090 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.073776 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0ca5b05d-84e2-4ebd-92c5-9a8d0bf1ccbf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ca5b05d-84e2-4ebd-92c5-9a8d0bf1ccbf\") pod \"ovsdbserver-nb-0\" (UID: \"ba7aaad4-94c3-4202-a512-a84cba9bcb9f\") " pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:54 crc kubenswrapper[4756]: I0318 14:19:54.146798 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.350908 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd"] Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.359238 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.362450 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.362523 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.362683 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.362844 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.364146 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-gddbb" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.367307 4756 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd"] Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.482370 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7acb694-7937-45a7-8aab-c2175fae6423-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd\" (UID: \"e7acb694-7937-45a7-8aab-c2175fae6423\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.482427 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7acb694-7937-45a7-8aab-c2175fae6423-config\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd\" (UID: \"e7acb694-7937-45a7-8aab-c2175fae6423\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.482453 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7lgr\" (UniqueName: \"kubernetes.io/projected/e7acb694-7937-45a7-8aab-c2175fae6423-kube-api-access-l7lgr\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd\" (UID: \"e7acb694-7937-45a7-8aab-c2175fae6423\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.482480 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/e7acb694-7937-45a7-8aab-c2175fae6423-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd\" (UID: \"e7acb694-7937-45a7-8aab-c2175fae6423\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 
14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.482639 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/e7acb694-7937-45a7-8aab-c2175fae6423-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd\" (UID: \"e7acb694-7937-45a7-8aab-c2175fae6423\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.522198 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.543044 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4"] Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.543226 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.544749 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.546022 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.555856 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-j2h7x" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.565285 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.565897 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.566770 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.587244 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7acb694-7937-45a7-8aab-c2175fae6423-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd\" (UID: \"e7acb694-7937-45a7-8aab-c2175fae6423\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.587358 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7acb694-7937-45a7-8aab-c2175fae6423-config\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd\" (UID: \"e7acb694-7937-45a7-8aab-c2175fae6423\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.587405 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7lgr\" (UniqueName: 
\"kubernetes.io/projected/e7acb694-7937-45a7-8aab-c2175fae6423-kube-api-access-l7lgr\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd\" (UID: \"e7acb694-7937-45a7-8aab-c2175fae6423\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.587438 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/e7acb694-7937-45a7-8aab-c2175fae6423-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd\" (UID: \"e7acb694-7937-45a7-8aab-c2175fae6423\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.587484 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/e7acb694-7937-45a7-8aab-c2175fae6423-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd\" (UID: \"e7acb694-7937-45a7-8aab-c2175fae6423\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.587995 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.588768 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7acb694-7937-45a7-8aab-c2175fae6423-config\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd\" (UID: \"e7acb694-7937-45a7-8aab-c2175fae6423\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.588968 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e7acb694-7937-45a7-8aab-c2175fae6423-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd\" (UID: \"e7acb694-7937-45a7-8aab-c2175fae6423\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.595963 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.596266 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.608700 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/e7acb694-7937-45a7-8aab-c2175fae6423-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd\" (UID: \"e7acb694-7937-45a7-8aab-c2175fae6423\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.609487 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/e7acb694-7937-45a7-8aab-c2175fae6423-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd\" (UID: \"e7acb694-7937-45a7-8aab-c2175fae6423\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.618540 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4"] Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.627401 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7lgr\" (UniqueName: \"kubernetes.io/projected/e7acb694-7937-45a7-8aab-c2175fae6423-kube-api-access-l7lgr\") pod 
\"cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd\" (UID: \"e7acb694-7937-45a7-8aab-c2175fae6423\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.660860 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5"] Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.668032 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.672488 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.672948 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.690173 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.691014 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/7edadc4c-acee-49a2-b629-c4505d40eebc-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.691067 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.691089 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7edadc4c-acee-49a2-b629-c4505d40eebc-config\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.691202 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.691284 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: 
\"kubernetes.io/secret/7edadc4c-acee-49a2-b629-c4505d40eebc-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.691355 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.691380 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.691402 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7edadc4c-acee-49a2-b629-c4505d40eebc-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.691466 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f2f57c01-7a0e-42c8-b6ef-6aa247f06a53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2f57c01-7a0e-42c8-b6ef-6aa247f06a53\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 
14:19:57.691503 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/7edadc4c-acee-49a2-b629-c4505d40eebc-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.691527 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt769\" (UniqueName: \"kubernetes.io/projected/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-kube-api-access-zt769\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.691549 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cszm8\" (UniqueName: \"kubernetes.io/projected/7edadc4c-acee-49a2-b629-c4505d40eebc-kube-api-access-cszm8\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.691570 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.691675 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-config\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " 
pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.700696 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5"] Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.765578 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5"] Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.766824 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.769751 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.770283 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-vzqxj" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.770365 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.770449 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.770565 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.771627 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.771873 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5"] Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.773427 4756 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cloudkitty-lokistack-gateway" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.793763 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g"] Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.794473 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.794553 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.794638 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7edadc4c-acee-49a2-b629-c4505d40eebc-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.794726 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4999c7cd-7963-42e4-8404-a0203664d331-config\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-4pgf5\" (UID: \"4999c7cd-7963-42e4-8404-a0203664d331\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.794802 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/4999c7cd-7963-42e4-8404-a0203664d331-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-4pgf5\" (UID: \"4999c7cd-7963-42e4-8404-a0203664d331\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.794878 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f2f57c01-7a0e-42c8-b6ef-6aa247f06a53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2f57c01-7a0e-42c8-b6ef-6aa247f06a53\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.794956 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/7edadc4c-acee-49a2-b629-c4505d40eebc-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.795031 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt769\" (UniqueName: \"kubernetes.io/projected/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-kube-api-access-zt769\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.795130 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cszm8\" (UniqueName: \"kubernetes.io/projected/7edadc4c-acee-49a2-b629-c4505d40eebc-kube-api-access-cszm8\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: 
\"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.795207 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.795302 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-config\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.795399 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/7edadc4c-acee-49a2-b629-c4505d40eebc-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.795483 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.795550 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7edadc4c-acee-49a2-b629-c4505d40eebc-config\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " 
pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.795631 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4q6m\" (UniqueName: \"kubernetes.io/projected/4999c7cd-7963-42e4-8404-a0203664d331-kube-api-access-w4q6m\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-4pgf5\" (UID: \"4999c7cd-7963-42e4-8404-a0203664d331\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.795706 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7edadc4c-acee-49a2-b629-c4505d40eebc-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.795774 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.795849 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4999c7cd-7963-42e4-8404-a0203664d331-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-4pgf5\" (UID: \"4999c7cd-7963-42e4-8404-a0203664d331\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.795933 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/7edadc4c-acee-49a2-b629-c4505d40eebc-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.796006 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/4999c7cd-7963-42e4-8404-a0203664d331-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-4pgf5\" (UID: \"4999c7cd-7963-42e4-8404-a0203664d331\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.794917 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.800269 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.800856 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-config\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.801427 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.802637 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7edadc4c-acee-49a2-b629-c4505d40eebc-config\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.803508 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.808244 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.811045 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.811580 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/7edadc4c-acee-49a2-b629-c4505d40eebc-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " 
pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.813453 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/7edadc4c-acee-49a2-b629-c4505d40eebc-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.818704 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/7edadc4c-acee-49a2-b629-c4505d40eebc-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.819690 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.819724 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f2f57c01-7a0e-42c8-b6ef-6aa247f06a53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2f57c01-7a0e-42c8-b6ef-6aa247f06a53\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/08a91e828d0530ba719cf9a4126f48c11f888cd3a68d0e9ce87d6d413467c583/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.823707 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cszm8\" (UniqueName: \"kubernetes.io/projected/7edadc4c-acee-49a2-b629-c4505d40eebc-kube-api-access-cszm8\") pod \"cloudkitty-lokistack-querier-668f98fdd7-sv5g4\" (UID: \"7edadc4c-acee-49a2-b629-c4505d40eebc\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.834473 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt769\" (UniqueName: \"kubernetes.io/projected/ee4b1cdc-8b62-42b6-9bc7-61164f90afb4-kube-api-access-zt769\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.836895 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g"] Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.865083 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f2f57c01-7a0e-42c8-b6ef-6aa247f06a53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2f57c01-7a0e-42c8-b6ef-6aa247f06a53\") pod \"ovsdbserver-sb-0\" (UID: \"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4\") " pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: 
I0318 14:19:57.897821 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/4999c7cd-7963-42e4-8404-a0203664d331-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-4pgf5\" (UID: \"4999c7cd-7963-42e4-8404-a0203664d331\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.897888 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a3a9e126-317d-4f52-a4f8-e657dfa9930c-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.897914 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a9e126-317d-4f52-a4f8-e657dfa9930c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.897941 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4999c7cd-7963-42e4-8404-a0203664d331-config\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-4pgf5\" (UID: \"4999c7cd-7963-42e4-8404-a0203664d331\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.897962 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: 
\"kubernetes.io/secret/4999c7cd-7963-42e4-8404-a0203664d331-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-4pgf5\" (UID: \"4999c7cd-7963-42e4-8404-a0203664d331\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.897983 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffjxx\" (UniqueName: \"kubernetes.io/projected/0e90a533-9bd6-4f94-a8c4-52218f2919b0-kube-api-access-ffjxx\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.898078 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/0e90a533-9bd6-4f94-a8c4-52218f2919b0-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.898133 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a3a9e126-317d-4f52-a4f8-e657dfa9930c-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.898154 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a3a9e126-317d-4f52-a4f8-e657dfa9930c-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: 
\"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.898214 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/0e90a533-9bd6-4f94-a8c4-52218f2919b0-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.899019 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4999c7cd-7963-42e4-8404-a0203664d331-config\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-4pgf5\" (UID: \"4999c7cd-7963-42e4-8404-a0203664d331\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.899709 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e90a533-9bd6-4f94-a8c4-52218f2919b0-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.899809 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a9e126-317d-4f52-a4f8-e657dfa9930c-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.899846 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/0e90a533-9bd6-4f94-a8c4-52218f2919b0-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.899925 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/0e90a533-9bd6-4f94-a8c4-52218f2919b0-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.899944 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxqdz\" (UniqueName: \"kubernetes.io/projected/a3a9e126-317d-4f52-a4f8-e657dfa9930c-kube-api-access-kxqdz\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.899996 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e90a533-9bd6-4f94-a8c4-52218f2919b0-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.900026 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: 
\"kubernetes.io/configmap/a3a9e126-317d-4f52-a4f8-e657dfa9930c-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.900048 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/0e90a533-9bd6-4f94-a8c4-52218f2919b0-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.900073 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4q6m\" (UniqueName: \"kubernetes.io/projected/4999c7cd-7963-42e4-8404-a0203664d331-kube-api-access-w4q6m\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-4pgf5\" (UID: \"4999c7cd-7963-42e4-8404-a0203664d331\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.900300 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4999c7cd-7963-42e4-8404-a0203664d331-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-4pgf5\" (UID: \"4999c7cd-7963-42e4-8404-a0203664d331\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.900334 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a9e126-317d-4f52-a4f8-e657dfa9930c-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: 
\"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.900383 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a3a9e126-317d-4f52-a4f8-e657dfa9930c-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.900436 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e90a533-9bd6-4f94-a8c4-52218f2919b0-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.900640 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/4999c7cd-7963-42e4-8404-a0203664d331-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-4pgf5\" (UID: \"4999c7cd-7963-42e4-8404-a0203664d331\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.901075 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4999c7cd-7963-42e4-8404-a0203664d331-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-4pgf5\" (UID: \"4999c7cd-7963-42e4-8404-a0203664d331\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 
14:19:57.901690 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/4999c7cd-7963-42e4-8404-a0203664d331-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-4pgf5\" (UID: \"4999c7cd-7963-42e4-8404-a0203664d331\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.916463 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4q6m\" (UniqueName: \"kubernetes.io/projected/4999c7cd-7963-42e4-8404-a0203664d331-kube-api-access-w4q6m\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-4pgf5\" (UID: \"4999c7cd-7963-42e4-8404-a0203664d331\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.965353 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 14:19:57 crc kubenswrapper[4756]: I0318 14:19:57.978973 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.001475 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.001693 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/0e90a533-9bd6-4f94-a8c4-52218f2919b0-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.001732 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxqdz\" (UniqueName: \"kubernetes.io/projected/a3a9e126-317d-4f52-a4f8-e657dfa9930c-kube-api-access-kxqdz\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.001762 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e90a533-9bd6-4f94-a8c4-52218f2919b0-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.001784 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a3a9e126-317d-4f52-a4f8-e657dfa9930c-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.001803 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/0e90a533-9bd6-4f94-a8c4-52218f2919b0-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.001828 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a9e126-317d-4f52-a4f8-e657dfa9930c-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.001848 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a3a9e126-317d-4f52-a4f8-e657dfa9930c-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.001865 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e90a533-9bd6-4f94-a8c4-52218f2919b0-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.001897 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a3a9e126-317d-4f52-a4f8-e657dfa9930c-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " 
pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.001916 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a9e126-317d-4f52-a4f8-e657dfa9930c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.001946 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffjxx\" (UniqueName: \"kubernetes.io/projected/0e90a533-9bd6-4f94-a8c4-52218f2919b0-kube-api-access-ffjxx\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.001964 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/0e90a533-9bd6-4f94-a8c4-52218f2919b0-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.001978 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a3a9e126-317d-4f52-a4f8-e657dfa9930c-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.001995 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" 
(UniqueName: \"kubernetes.io/secret/a3a9e126-317d-4f52-a4f8-e657dfa9930c-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.002013 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/0e90a533-9bd6-4f94-a8c4-52218f2919b0-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.002047 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e90a533-9bd6-4f94-a8c4-52218f2919b0-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.002072 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a9e126-317d-4f52-a4f8-e657dfa9930c-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.002089 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/0e90a533-9bd6-4f94-a8c4-52218f2919b0-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 
14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.003041 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/0e90a533-9bd6-4f94-a8c4-52218f2919b0-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.003644 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a9e126-317d-4f52-a4f8-e657dfa9930c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.004962 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a3a9e126-317d-4f52-a4f8-e657dfa9930c-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.005983 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/0e90a533-9bd6-4f94-a8c4-52218f2919b0-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.007275 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a9e126-317d-4f52-a4f8-e657dfa9930c-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: 
\"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.007299 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a3a9e126-317d-4f52-a4f8-e657dfa9930c-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.007352 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e90a533-9bd6-4f94-a8c4-52218f2919b0-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.007906 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e90a533-9bd6-4f94-a8c4-52218f2919b0-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.008083 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a9e126-317d-4f52-a4f8-e657dfa9930c-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.008673 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"tenants\" (UniqueName: \"kubernetes.io/secret/0e90a533-9bd6-4f94-a8c4-52218f2919b0-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.008765 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/0e90a533-9bd6-4f94-a8c4-52218f2919b0-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.009340 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a3a9e126-317d-4f52-a4f8-e657dfa9930c-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.010076 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e90a533-9bd6-4f94-a8c4-52218f2919b0-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.010613 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/0e90a533-9bd6-4f94-a8c4-52218f2919b0-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 
14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.012957 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a3a9e126-317d-4f52-a4f8-e657dfa9930c-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.014036 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a3a9e126-317d-4f52-a4f8-e657dfa9930c-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.027668 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxqdz\" (UniqueName: \"kubernetes.io/projected/a3a9e126-317d-4f52-a4f8-e657dfa9930c-kube-api-access-kxqdz\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g\" (UID: \"a3a9e126-317d-4f52-a4f8-e657dfa9930c\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.030062 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffjxx\" (UniqueName: \"kubernetes.io/projected/0e90a533-9bd6-4f94-a8c4-52218f2919b0-kube-api-access-ffjxx\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-p44r5\" (UID: \"0e90a533-9bd6-4f94-a8c4-52218f2919b0\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.088472 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.188778 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.515617 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.517111 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.523461 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.523502 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.529308 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.613483 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.613853 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twpl4\" (UniqueName: \"kubernetes.io/projected/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-kube-api-access-twpl4\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.613888 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: 
\"kubernetes.io/secret/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.613919 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.613938 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.613960 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.613982 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.614016 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.614051 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.615066 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.622349 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.622465 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.634748 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.707313 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.708651 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.712497 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.713234 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.715536 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.715603 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/06157d4c-39c0-4895-b002-79ee7b960512-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.715636 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.715669 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8tld\" (UniqueName: \"kubernetes.io/projected/06157d4c-39c0-4895-b002-79ee7b960512-kube-api-access-g8tld\") pod 
\"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.715708 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.715762 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06157d4c-39c0-4895-b002-79ee7b960512-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.715821 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twpl4\" (UniqueName: \"kubernetes.io/projected/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-kube-api-access-twpl4\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.715853 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.715890 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.715921 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.715948 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.715977 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/06157d4c-39c0-4895-b002-79ee7b960512-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.716005 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/06157d4c-39c0-4895-b002-79ee7b960512-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.716031 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06157d4c-39c0-4895-b002-79ee7b960512-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.716057 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.717662 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.718189 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.718991 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 
crc kubenswrapper[4756]: I0318 14:19:58.719595 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.722916 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.729556 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.739659 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.740493 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twpl4\" (UniqueName: \"kubernetes.io/projected/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-kube-api-access-twpl4\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.764683 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: 
\"kubernetes.io/secret/ca8fb9b6-a1a9-4781-af8f-2e7e78e62771-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.765534 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.787543 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.818695 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/06157d4c-39c0-4895-b002-79ee7b960512-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.818744 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.818792 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: 
\"kubernetes.io/secret/3528f495-dffb-47d3-99fe-69054008e8cd-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.818821 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8tld\" (UniqueName: \"kubernetes.io/projected/06157d4c-39c0-4895-b002-79ee7b960512-kube-api-access-g8tld\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.818847 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/3528f495-dffb-47d3-99fe-69054008e8cd-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.818869 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.818912 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06157d4c-39c0-4895-b002-79ee7b960512-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.818936 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3528f495-dffb-47d3-99fe-69054008e8cd-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.819012 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/3528f495-dffb-47d3-99fe-69054008e8cd-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.819031 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vj65\" (UniqueName: \"kubernetes.io/projected/3528f495-dffb-47d3-99fe-69054008e8cd-kube-api-access-4vj65\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.819054 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/06157d4c-39c0-4895-b002-79ee7b960512-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.819071 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/06157d4c-39c0-4895-b002-79ee7b960512-cloudkitty-loki-s3\") pod 
\"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.819091 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06157d4c-39c0-4895-b002-79ee7b960512-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.819136 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3528f495-dffb-47d3-99fe-69054008e8cd-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.819992 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.821456 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06157d4c-39c0-4895-b002-79ee7b960512-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.823054 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/06157d4c-39c0-4895-b002-79ee7b960512-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.824558 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/06157d4c-39c0-4895-b002-79ee7b960512-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.834683 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.838845 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/06157d4c-39c0-4895-b002-79ee7b960512-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.841194 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/06157d4c-39c0-4895-b002-79ee7b960512-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.846245 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8tld\" (UniqueName: \"kubernetes.io/projected/06157d4c-39c0-4895-b002-79ee7b960512-kube-api-access-g8tld\") pod \"cloudkitty-lokistack-compactor-0\" (UID: 
\"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.860643 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"06157d4c-39c0-4895-b002-79ee7b960512\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.920924 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3528f495-dffb-47d3-99fe-69054008e8cd-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.921282 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/3528f495-dffb-47d3-99fe-69054008e8cd-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.921315 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/3528f495-dffb-47d3-99fe-69054008e8cd-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.921336 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: 
\"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.921383 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3528f495-dffb-47d3-99fe-69054008e8cd-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.921446 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/3528f495-dffb-47d3-99fe-69054008e8cd-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.921462 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vj65\" (UniqueName: \"kubernetes.io/projected/3528f495-dffb-47d3-99fe-69054008e8cd-kube-api-access-4vj65\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.922272 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3528f495-dffb-47d3-99fe-69054008e8cd-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.922961 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.923151 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3528f495-dffb-47d3-99fe-69054008e8cd-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.925547 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/3528f495-dffb-47d3-99fe-69054008e8cd-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.925792 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/3528f495-dffb-47d3-99fe-69054008e8cd-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.928373 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/3528f495-dffb-47d3-99fe-69054008e8cd-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.937676 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vj65\" (UniqueName: \"kubernetes.io/projected/3528f495-dffb-47d3-99fe-69054008e8cd-kube-api-access-4vj65\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.944636 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"3528f495-dffb-47d3-99fe-69054008e8cd\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:58 crc kubenswrapper[4756]: I0318 14:19:58.970294 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:19:59 crc kubenswrapper[4756]: I0318 14:19:59.143828 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 14:19:59 crc kubenswrapper[4756]: I0318 14:19:59.185090 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:19:59 crc kubenswrapper[4756]: E0318 14:19:59.683104 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 14:19:59 crc kubenswrapper[4756]: E0318 14:19:59.683631 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-plltk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNot
Present,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-fvc2v_openstack(b8398a92-1446-4d7f-b776-ba94565d42ee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 14:19:59 crc kubenswrapper[4756]: E0318 14:19:59.684852 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" podUID="b8398a92-1446-4d7f-b776-ba94565d42ee" Mar 18 14:19:59 crc kubenswrapper[4756]: E0318 14:19:59.751508 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 14:19:59 crc kubenswrapper[4756]: E0318 14:19:59.751691 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-twl8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-6wghm_openstack(7533a2b3-c9f8-4594-ba41-f200d085e0de): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 14:19:59 crc kubenswrapper[4756]: E0318 14:19:59.753099 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-6wghm" podUID="7533a2b3-c9f8-4594-ba41-f200d085e0de" Mar 18 14:19:59 crc kubenswrapper[4756]: W0318 14:19:59.781012 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18faa7ad_0836_473a_aebe_0a6e5357b554.slice/crio-c496da8682c2f2ae29c6d87dd609630f55ac675fbd8950fd7a489f11ebedb219 WatchSource:0}: Error finding container c496da8682c2f2ae29c6d87dd609630f55ac675fbd8950fd7a489f11ebedb219: Status 404 returned error can't find the container with id c496da8682c2f2ae29c6d87dd609630f55ac675fbd8950fd7a489f11ebedb219 Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.108128 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.145911 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564060-mkctk"] Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.148198 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564060-mkctk" Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.150743 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.151293 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.152914 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564060-mkctk"] Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.153283 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.226653 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.242919 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vdzb\" (UniqueName: \"kubernetes.io/projected/d4432811-b291-4fac-a2e6-ad17c9d83f51-kube-api-access-5vdzb\") pod \"auto-csr-approver-29564060-mkctk\" (UID: \"d4432811-b291-4fac-a2e6-ad17c9d83f51\") " pod="openshift-infra/auto-csr-approver-29564060-mkctk" Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.343918 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vdzb\" (UniqueName: \"kubernetes.io/projected/d4432811-b291-4fac-a2e6-ad17c9d83f51-kube-api-access-5vdzb\") pod \"auto-csr-approver-29564060-mkctk\" (UID: \"d4432811-b291-4fac-a2e6-ad17c9d83f51\") " pod="openshift-infra/auto-csr-approver-29564060-mkctk" Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.370359 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vdzb\" (UniqueName: 
\"kubernetes.io/projected/d4432811-b291-4fac-a2e6-ad17c9d83f51-kube-api-access-5vdzb\") pod \"auto-csr-approver-29564060-mkctk\" (UID: \"d4432811-b291-4fac-a2e6-ad17c9d83f51\") " pod="openshift-infra/auto-csr-approver-29564060-mkctk" Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.582783 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564060-mkctk" Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.606815 4756 generic.go:334] "Generic (PLEG): container finished" podID="7c25b083-0273-4f26-8fb7-ca7e78907b8c" containerID="71080e2f7a923687a82b980e51f4a20a6b2641243d30bff3a1c4f9763a551a86" exitCode=0 Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.606869 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" event={"ID":"7c25b083-0273-4f26-8fb7-ca7e78907b8c","Type":"ContainerDied","Data":"71080e2f7a923687a82b980e51f4a20a6b2641243d30bff3a1c4f9763a551a86"} Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.608660 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"18faa7ad-0836-473a-aebe-0a6e5357b554","Type":"ContainerStarted","Data":"c496da8682c2f2ae29c6d87dd609630f55ac675fbd8950fd7a489f11ebedb219"} Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.610871 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce","Type":"ContainerStarted","Data":"be8855feea12b5d5b568bbf4c2eb5c2979962e937b09985eb3d98a6107dbbe85"} Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.611852 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"67dc4771-f106-4f33-9f84-0d7251e4259d","Type":"ContainerStarted","Data":"68c770138fc5e00ede674effb0b25a284c8f570dfecf7c4a1f6315c49e4168a9"} Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.614464 4756 generic.go:334] "Generic 
(PLEG): container finished" podID="8994855c-7079-4927-ac5d-1d875288634d" containerID="d8f44e38dd13ddcebf48393e47e6f7edb28b586f4c3fd8adf21387278542e8a7" exitCode=0 Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.614693 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" event={"ID":"8994855c-7079-4927-ac5d-1d875288634d","Type":"ContainerDied","Data":"d8f44e38dd13ddcebf48393e47e6f7edb28b586f4c3fd8adf21387278542e8a7"} Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.814968 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.875561 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.885624 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 18 14:20:00 crc kubenswrapper[4756]: I0318 14:20:00.894143 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 14:20:00 crc kubenswrapper[4756]: W0318 14:20:00.983023 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5d3bcfe_0ae1_4104_8433_ffb4569a29d8.slice/crio-828a8dc36d34237f68dc28d0a9fc55fa956a36cc9ab5898f523fe5d42feff10a WatchSource:0}: Error finding container 828a8dc36d34237f68dc28d0a9fc55fa956a36cc9ab5898f523fe5d42feff10a: Status 404 returned error can't find the container with id 828a8dc36d34237f68dc28d0a9fc55fa956a36cc9ab5898f523fe5d42feff10a Mar 18 14:20:00 crc kubenswrapper[4756]: W0318 14:20:00.985146 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d691251_328b_4c09_98d1_4b968ab5bc05.slice/crio-9e2d53bb4e52ab0966a6dbb3cb60d42277781bd3918282f1d9b7431ffc45d86e WatchSource:0}: Error finding 
container 9e2d53bb4e52ab0966a6dbb3cb60d42277781bd3918282f1d9b7431ffc45d86e: Status 404 returned error can't find the container with id 9e2d53bb4e52ab0966a6dbb3cb60d42277781bd3918282f1d9b7431ffc45d86e Mar 18 14:20:00 crc kubenswrapper[4756]: W0318 14:20:00.989874 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod529ae791_8631_4ca5_9e4b_bb857d6264a8.slice/crio-c173c9d62a1175f9d14fa83de98040e9121ae1379344835501437cfe2f0ce0bb WatchSource:0}: Error finding container c173c9d62a1175f9d14fa83de98040e9121ae1379344835501437cfe2f0ce0bb: Status 404 returned error can't find the container with id c173c9d62a1175f9d14fa83de98040e9121ae1379344835501437cfe2f0ce0bb Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.388351 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4"] Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.425163 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5"] Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.433525 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd"] Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.447350 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r6s8c"] Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.461469 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g"] Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.468896 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.499709 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 
14:20:01.533561 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.541780 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5"] Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.549106 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.554692 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.588975 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564060-mkctk"] Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.607141 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vsj9n"] Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.623936 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" event={"ID":"8994855c-7079-4927-ac5d-1d875288634d","Type":"ContainerStarted","Data":"0ab432c8867b24b7c18085a28381de472c337873630cc7faa2b863a5c84b4a04"} Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.624145 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.627645 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" event={"ID":"7c25b083-0273-4f26-8fb7-ca7e78907b8c","Type":"ContainerStarted","Data":"74fb192257f4a49010e4b73ce8424a4b1a99661af99e75c492cc27e35a2d77ee"} Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.627784 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" Mar 18 14:20:01 crc kubenswrapper[4756]: 
I0318 14:20:01.628841 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"529ae791-8631-4ca5-9e4b-bb857d6264a8","Type":"ContainerStarted","Data":"c173c9d62a1175f9d14fa83de98040e9121ae1379344835501437cfe2f0ce0bb"} Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.630100 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d691251-328b-4c09-98d1-4b968ab5bc05","Type":"ContainerStarted","Data":"9e2d53bb4e52ab0966a6dbb3cb60d42277781bd3918282f1d9b7431ffc45d86e"} Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.631490 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8","Type":"ContainerStarted","Data":"828a8dc36d34237f68dc28d0a9fc55fa956a36cc9ab5898f523fe5d42feff10a"} Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.633247 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe","Type":"ContainerStarted","Data":"d9284bade841acb81fe095109a58a2a3f74845d04482ba4ec02084e8582697f9"} Mar 18 14:20:01 crc kubenswrapper[4756]: W0318 14:20:01.633833 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca8fb9b6_a1a9_4781_af8f_2e7e78e62771.slice/crio-6f0b47b34b62dffbfb638f77118475a71fce0aec7d4dff8ca8fc96ca46aabfce WatchSource:0}: Error finding container 6f0b47b34b62dffbfb638f77118475a71fce0aec7d4dff8ca8fc96ca46aabfce: Status 404 returned error can't find the container with id 6f0b47b34b62dffbfb638f77118475a71fce0aec7d4dff8ca8fc96ca46aabfce Mar 18 14:20:01 crc kubenswrapper[4756]: W0318 14:20:01.637264 4756 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99dfb896_59f3_4f93_8d0e_4b19b49cbc56.slice/crio-244f07c249dd2dd52d1d13b2f4f2ceb63983bc16983d1e2382018f108cbaaa08 WatchSource:0}: Error finding container 244f07c249dd2dd52d1d13b2f4f2ceb63983bc16983d1e2382018f108cbaaa08: Status 404 returned error can't find the container with id 244f07c249dd2dd52d1d13b2f4f2ceb63983bc16983d1e2382018f108cbaaa08 Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.644812 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" podStartSLOduration=2.520091212 podStartE2EDuration="18.644791578s" podCreationTimestamp="2026-03-18 14:19:43 +0000 UTC" firstStartedPulling="2026-03-18 14:19:43.924105128 +0000 UTC m=+1185.238523103" lastFinishedPulling="2026-03-18 14:20:00.048805494 +0000 UTC m=+1201.363223469" observedRunningTime="2026-03-18 14:20:01.640901093 +0000 UTC m=+1202.955319078" watchObservedRunningTime="2026-03-18 14:20:01.644791578 +0000 UTC m=+1202.959209573" Mar 18 14:20:01 crc kubenswrapper[4756]: I0318 14:20:01.661455 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" podStartSLOduration=2.899914263 podStartE2EDuration="18.661436099s" podCreationTimestamp="2026-03-18 14:19:43 +0000 UTC" firstStartedPulling="2026-03-18 14:19:44.220846771 +0000 UTC m=+1185.535264746" lastFinishedPulling="2026-03-18 14:19:59.982368607 +0000 UTC m=+1201.296786582" observedRunningTime="2026-03-18 14:20:01.654529321 +0000 UTC m=+1202.968947296" watchObservedRunningTime="2026-03-18 14:20:01.661436099 +0000 UTC m=+1202.975854074" Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.152477 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 14:20:02 crc kubenswrapper[4756]: W0318 14:20:02.419259 4756 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4432811_b291_4fac_a2e6_ad17c9d83f51.slice/crio-3231feb74a862ffdb822b39057816aaa4476508d68101ef6629229c10856de16 WatchSource:0}: Error finding container 3231feb74a862ffdb822b39057816aaa4476508d68101ef6629229c10856de16: Status 404 returned error can't find the container with id 3231feb74a862ffdb822b39057816aaa4476508d68101ef6629229c10856de16 Mar 18 14:20:02 crc kubenswrapper[4756]: W0318 14:20:02.420799 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f5c347c_244f_40b6_8311_8eac0e22626a.slice/crio-d279a7951bca21e180589993c827e46d5a7e81143fbac5b3fca990f667cc9883 WatchSource:0}: Error finding container d279a7951bca21e180589993c827e46d5a7e81143fbac5b3fca990f667cc9883: Status 404 returned error can't find the container with id d279a7951bca21e180589993c827e46d5a7e81143fbac5b3fca990f667cc9883 Mar 18 14:20:02 crc kubenswrapper[4756]: W0318 14:20:02.430429 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba7aaad4_94c3_4202_a512_a84cba9bcb9f.slice/crio-cec820dfe8dda69df72154c9298b12b922f465e49a0ea0e83d782e71481d7ffd WatchSource:0}: Error finding container cec820dfe8dda69df72154c9298b12b922f465e49a0ea0e83d782e71481d7ffd: Status 404 returned error can't find the container with id cec820dfe8dda69df72154c9298b12b922f465e49a0ea0e83d782e71481d7ffd Mar 18 14:20:02 crc kubenswrapper[4756]: W0318 14:20:02.446012 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7edadc4c_acee_49a2_b629_c4505d40eebc.slice/crio-582c1d5f9a9aa50633f83fa365a84cbb263c351e5372630c6d43af3dbab0a616 WatchSource:0}: Error finding container 582c1d5f9a9aa50633f83fa365a84cbb263c351e5372630c6d43af3dbab0a616: Status 404 returned error can't find the container with id 
582c1d5f9a9aa50633f83fa365a84cbb263c351e5372630c6d43af3dbab0a616 Mar 18 14:20:02 crc kubenswrapper[4756]: W0318 14:20:02.448006 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e90a533_9bd6_4f94_a8c4_52218f2919b0.slice/crio-1d816c2e18aa138c308f7b0c9883b47b84bb6674a06b232ca7174dec8c4a092b WatchSource:0}: Error finding container 1d816c2e18aa138c308f7b0c9883b47b84bb6674a06b232ca7174dec8c4a092b: Status 404 returned error can't find the container with id 1d816c2e18aa138c308f7b0c9883b47b84bb6674a06b232ca7174dec8c4a092b Mar 18 14:20:02 crc kubenswrapper[4756]: W0318 14:20:02.455990 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4999c7cd_7963_42e4_8404_a0203664d331.slice/crio-853638009f388cb7ea55aedd77ea894912377c57bf03ee28939f8a7200f0fbfd WatchSource:0}: Error finding container 853638009f388cb7ea55aedd77ea894912377c57bf03ee28939f8a7200f0fbfd: Status 404 returned error can't find the container with id 853638009f388cb7ea55aedd77ea894912377c57bf03ee28939f8a7200f0fbfd Mar 18 14:20:02 crc kubenswrapper[4756]: W0318 14:20:02.461522 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3a9e126_317d_4f52_a4f8_e657dfa9930c.slice/crio-d461a75a28ca7be8a8ad85070e2e93bab3aefc5574f7d96dcf7a49e8d52b87c6 WatchSource:0}: Error finding container d461a75a28ca7be8a8ad85070e2e93bab3aefc5574f7d96dcf7a49e8d52b87c6: Status 404 returned error can't find the container with id d461a75a28ca7be8a8ad85070e2e93bab3aefc5574f7d96dcf7a49e8d52b87c6 Mar 18 14:20:02 crc kubenswrapper[4756]: W0318 14:20:02.463264 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7acb694_7937_45a7_8aab_c2175fae6423.slice/crio-8b9391eb878b8cdd3af92aa7ff2c36da6d7a28afb04cc2066010d3a6ce500b56 
WatchSource:0}: Error finding container 8b9391eb878b8cdd3af92aa7ff2c36da6d7a28afb04cc2066010d3a6ce500b56: Status 404 returned error can't find the container with id 8b9391eb878b8cdd3af92aa7ff2c36da6d7a28afb04cc2066010d3a6ce500b56 Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.529101 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6wghm" Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.534503 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.642291 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" event={"ID":"e7acb694-7937-45a7-8aab-c2175fae6423","Type":"ContainerStarted","Data":"8b9391eb878b8cdd3af92aa7ff2c36da6d7a28afb04cc2066010d3a6ce500b56"} Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.643659 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ba7aaad4-94c3-4202-a512-a84cba9bcb9f","Type":"ContainerStarted","Data":"cec820dfe8dda69df72154c9298b12b922f465e49a0ea0e83d782e71481d7ffd"} Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.644875 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"228ca85e-a493-4dc4-9b95-5148c92ba228","Type":"ContainerStarted","Data":"720448d87238743264c16f885f312f38ac03efe053862a641da99b7ba5023af5"} Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.646068 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"06157d4c-39c0-4895-b002-79ee7b960512","Type":"ContainerStarted","Data":"ed8f0c25b01dfc93cd2d19a44089ff058216784eea5e1b78b1df2fc04cb16dec"} Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.647421 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-675f4bcbfc-6wghm" event={"ID":"7533a2b3-c9f8-4594-ba41-f200d085e0de","Type":"ContainerDied","Data":"70101f2c9010710b60165b191dac1185721908527682c9e8b45bd8d1690d94d0"} Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.647434 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6wghm" Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.649175 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" event={"ID":"4999c7cd-7963-42e4-8404-a0203664d331","Type":"ContainerStarted","Data":"853638009f388cb7ea55aedd77ea894912377c57bf03ee28939f8a7200f0fbfd"} Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.650798 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564060-mkctk" event={"ID":"d4432811-b291-4fac-a2e6-ad17c9d83f51","Type":"ContainerStarted","Data":"3231feb74a862ffdb822b39057816aaa4476508d68101ef6629229c10856de16"} Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.652526 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4","Type":"ContainerStarted","Data":"1b5ed41da2c0131f57eacfc2c028e78ff1408bb0a2ade6a60ba237c0e8425d26"} Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.654081 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771","Type":"ContainerStarted","Data":"6f0b47b34b62dffbfb638f77118475a71fce0aec7d4dff8ca8fc96ca46aabfce"} Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.655446 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" event={"ID":"b8398a92-1446-4d7f-b776-ba94565d42ee","Type":"ContainerDied","Data":"fd7e0115ecac29f041ca2684ddf61322042bac96d2cd4c5256448b76fd011d99"} Mar 18 14:20:02 crc 
kubenswrapper[4756]: I0318 14:20:02.655467 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fvc2v" Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.657006 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"3528f495-dffb-47d3-99fe-69054008e8cd","Type":"ContainerStarted","Data":"3c701539dd29a7efe3489a7539d16234524d4d019bbc48454c2580a4a580149b"} Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.662170 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r6s8c" event={"ID":"99dfb896-59f3-4f93-8d0e-4b19b49cbc56","Type":"ContainerStarted","Data":"244f07c249dd2dd52d1d13b2f4f2ceb63983bc16983d1e2382018f108cbaaa08"} Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.663798 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" event={"ID":"7edadc4c-acee-49a2-b629-c4505d40eebc","Type":"ContainerStarted","Data":"582c1d5f9a9aa50633f83fa365a84cbb263c351e5372630c6d43af3dbab0a616"} Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.665478 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vsj9n" event={"ID":"3f5c347c-244f-40b6-8311-8eac0e22626a","Type":"ContainerStarted","Data":"d279a7951bca21e180589993c827e46d5a7e81143fbac5b3fca990f667cc9883"} Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.666747 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" event={"ID":"0e90a533-9bd6-4f94-a8c4-52218f2919b0","Type":"ContainerStarted","Data":"1d816c2e18aa138c308f7b0c9883b47b84bb6674a06b232ca7174dec8c4a092b"} Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.667908 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" 
event={"ID":"a3a9e126-317d-4f52-a4f8-e657dfa9930c","Type":"ContainerStarted","Data":"d461a75a28ca7be8a8ad85070e2e93bab3aefc5574f7d96dcf7a49e8d52b87c6"} Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.720799 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7533a2b3-c9f8-4594-ba41-f200d085e0de-config\") pod \"7533a2b3-c9f8-4594-ba41-f200d085e0de\" (UID: \"7533a2b3-c9f8-4594-ba41-f200d085e0de\") " Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.721006 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twl8f\" (UniqueName: \"kubernetes.io/projected/7533a2b3-c9f8-4594-ba41-f200d085e0de-kube-api-access-twl8f\") pod \"7533a2b3-c9f8-4594-ba41-f200d085e0de\" (UID: \"7533a2b3-c9f8-4594-ba41-f200d085e0de\") " Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.721047 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8398a92-1446-4d7f-b776-ba94565d42ee-config\") pod \"b8398a92-1446-4d7f-b776-ba94565d42ee\" (UID: \"b8398a92-1446-4d7f-b776-ba94565d42ee\") " Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.721082 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plltk\" (UniqueName: \"kubernetes.io/projected/b8398a92-1446-4d7f-b776-ba94565d42ee-kube-api-access-plltk\") pod \"b8398a92-1446-4d7f-b776-ba94565d42ee\" (UID: \"b8398a92-1446-4d7f-b776-ba94565d42ee\") " Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.721141 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8398a92-1446-4d7f-b776-ba94565d42ee-dns-svc\") pod \"b8398a92-1446-4d7f-b776-ba94565d42ee\" (UID: \"b8398a92-1446-4d7f-b776-ba94565d42ee\") " Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.722444 4756 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8398a92-1446-4d7f-b776-ba94565d42ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b8398a92-1446-4d7f-b776-ba94565d42ee" (UID: "b8398a92-1446-4d7f-b776-ba94565d42ee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.722759 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7533a2b3-c9f8-4594-ba41-f200d085e0de-config" (OuterVolumeSpecName: "config") pod "7533a2b3-c9f8-4594-ba41-f200d085e0de" (UID: "7533a2b3-c9f8-4594-ba41-f200d085e0de"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.722846 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8398a92-1446-4d7f-b776-ba94565d42ee-config" (OuterVolumeSpecName: "config") pod "b8398a92-1446-4d7f-b776-ba94565d42ee" (UID: "b8398a92-1446-4d7f-b776-ba94565d42ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.727225 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7533a2b3-c9f8-4594-ba41-f200d085e0de-kube-api-access-twl8f" (OuterVolumeSpecName: "kube-api-access-twl8f") pod "7533a2b3-c9f8-4594-ba41-f200d085e0de" (UID: "7533a2b3-c9f8-4594-ba41-f200d085e0de"). InnerVolumeSpecName "kube-api-access-twl8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.728242 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8398a92-1446-4d7f-b776-ba94565d42ee-kube-api-access-plltk" (OuterVolumeSpecName: "kube-api-access-plltk") pod "b8398a92-1446-4d7f-b776-ba94565d42ee" (UID: "b8398a92-1446-4d7f-b776-ba94565d42ee"). 
InnerVolumeSpecName "kube-api-access-plltk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.823063 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twl8f\" (UniqueName: \"kubernetes.io/projected/7533a2b3-c9f8-4594-ba41-f200d085e0de-kube-api-access-twl8f\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.823097 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8398a92-1446-4d7f-b776-ba94565d42ee-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.823109 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plltk\" (UniqueName: \"kubernetes.io/projected/b8398a92-1446-4d7f-b776-ba94565d42ee-kube-api-access-plltk\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.823173 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8398a92-1446-4d7f-b776-ba94565d42ee-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:02 crc kubenswrapper[4756]: I0318 14:20:02.823181 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7533a2b3-c9f8-4594-ba41-f200d085e0de-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:03 crc kubenswrapper[4756]: I0318 14:20:03.002215 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6wghm"] Mar 18 14:20:03 crc kubenswrapper[4756]: I0318 14:20:03.005468 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6wghm"] Mar 18 14:20:03 crc kubenswrapper[4756]: I0318 14:20:03.025654 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fvc2v"] Mar 18 14:20:03 crc kubenswrapper[4756]: I0318 14:20:03.030201 4756 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fvc2v"] Mar 18 14:20:03 crc kubenswrapper[4756]: I0318 14:20:03.325151 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7533a2b3-c9f8-4594-ba41-f200d085e0de" path="/var/lib/kubelet/pods/7533a2b3-c9f8-4594-ba41-f200d085e0de/volumes" Mar 18 14:20:03 crc kubenswrapper[4756]: I0318 14:20:03.325513 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8398a92-1446-4d7f-b776-ba94565d42ee" path="/var/lib/kubelet/pods/b8398a92-1446-4d7f-b776-ba94565d42ee/volumes" Mar 18 14:20:08 crc kubenswrapper[4756]: I0318 14:20:08.516468 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" Mar 18 14:20:08 crc kubenswrapper[4756]: I0318 14:20:08.794369 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" Mar 18 14:20:08 crc kubenswrapper[4756]: I0318 14:20:08.839551 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-b7mcw"] Mar 18 14:20:08 crc kubenswrapper[4756]: I0318 14:20:08.839888 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" podUID="8994855c-7079-4927-ac5d-1d875288634d" containerName="dnsmasq-dns" containerID="cri-o://0ab432c8867b24b7c18085a28381de472c337873630cc7faa2b863a5c84b4a04" gracePeriod=10 Mar 18 14:20:10 crc kubenswrapper[4756]: I0318 14:20:10.741896 4756 generic.go:334] "Generic (PLEG): container finished" podID="8994855c-7079-4927-ac5d-1d875288634d" containerID="0ab432c8867b24b7c18085a28381de472c337873630cc7faa2b863a5c84b4a04" exitCode=0 Mar 18 14:20:10 crc kubenswrapper[4756]: I0318 14:20:10.741974 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" 
event={"ID":"8994855c-7079-4927-ac5d-1d875288634d","Type":"ContainerDied","Data":"0ab432c8867b24b7c18085a28381de472c337873630cc7faa2b863a5c84b4a04"} Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.212104 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-gkrbd"] Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.213688 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.215833 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.230738 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-gkrbd"] Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.344208 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac7f17bf-2987-447a-a61b-c0b97615ced5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.344287 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s96dq\" (UniqueName: \"kubernetes.io/projected/ac7f17bf-2987-447a-a61b-c0b97615ced5-kube-api-access-s96dq\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.344318 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ac7f17bf-2987-447a-a61b-c0b97615ced5-ovs-rundir\") pod \"ovn-controller-metrics-gkrbd\" (UID: 
\"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.344354 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac7f17bf-2987-447a-a61b-c0b97615ced5-config\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.344383 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-pfbm4"] Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.344792 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ac7f17bf-2987-447a-a61b-c0b97615ced5-ovn-rundir\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.344824 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac7f17bf-2987-447a-a61b-c0b97615ced5-combined-ca-bundle\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.349446 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.352242 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.366078 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-pfbm4"] Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.445939 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac7f17bf-2987-447a-a61b-c0b97615ced5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.446029 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s96dq\" (UniqueName: \"kubernetes.io/projected/ac7f17bf-2987-447a-a61b-c0b97615ced5-kube-api-access-s96dq\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.446062 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ac7f17bf-2987-447a-a61b-c0b97615ced5-ovs-rundir\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.446098 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac7f17bf-2987-447a-a61b-c0b97615ced5-config\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: 
I0318 14:20:17.446141 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ac7f17bf-2987-447a-a61b-c0b97615ced5-ovn-rundir\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.446165 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac7f17bf-2987-447a-a61b-c0b97615ced5-combined-ca-bundle\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.446649 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ac7f17bf-2987-447a-a61b-c0b97615ced5-ovs-rundir\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.446663 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ac7f17bf-2987-447a-a61b-c0b97615ced5-ovn-rundir\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.447159 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac7f17bf-2987-447a-a61b-c0b97615ced5-config\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.453748 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac7f17bf-2987-447a-a61b-c0b97615ced5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.453831 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac7f17bf-2987-447a-a61b-c0b97615ced5-combined-ca-bundle\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.461857 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s96dq\" (UniqueName: \"kubernetes.io/projected/ac7f17bf-2987-447a-a61b-c0b97615ced5-kube-api-access-s96dq\") pod \"ovn-controller-metrics-gkrbd\" (UID: \"ac7f17bf-2987-447a-a61b-c0b97615ced5\") " pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.552435 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-gkrbd" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.553834 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-pfbm4\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.553977 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-pfbm4\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.554020 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sq6q\" (UniqueName: \"kubernetes.io/projected/8f04d7df-0cba-4b47-a033-404303aa6e40-kube-api-access-8sq6q\") pod \"dnsmasq-dns-7fd796d7df-pfbm4\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.554097 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-config\") pod \"dnsmasq-dns-7fd796d7df-pfbm4\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.603819 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-pfbm4"] Mar 18 14:20:17 crc kubenswrapper[4756]: E0318 14:20:17.604617 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config 
dns-svc kube-api-access-8sq6q ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" podUID="8f04d7df-0cba-4b47-a033-404303aa6e40" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.629139 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bdzg6"] Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.647999 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.653811 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.661059 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-config\") pod \"dnsmasq-dns-86db49b7ff-bdzg6\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.661253 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-bdzg6\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.661418 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-config\") pod \"dnsmasq-dns-7fd796d7df-pfbm4\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.661840 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-pfbm4\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.662383 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-bdzg6\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.662824 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-pfbm4\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.662835 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x59pj\" (UniqueName: \"kubernetes.io/projected/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-kube-api-access-x59pj\") pod \"dnsmasq-dns-86db49b7ff-bdzg6\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.663169 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-bdzg6\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.663316 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-pfbm4\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.663368 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sq6q\" (UniqueName: \"kubernetes.io/projected/8f04d7df-0cba-4b47-a033-404303aa6e40-kube-api-access-8sq6q\") pod \"dnsmasq-dns-7fd796d7df-pfbm4\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.664733 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-config\") pod \"dnsmasq-dns-7fd796d7df-pfbm4\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.664926 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-pfbm4\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.677704 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bdzg6"] Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.684077 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sq6q\" (UniqueName: \"kubernetes.io/projected/8f04d7df-0cba-4b47-a033-404303aa6e40-kube-api-access-8sq6q\") pod \"dnsmasq-dns-7fd796d7df-pfbm4\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:17 crc 
kubenswrapper[4756]: I0318 14:20:17.765032 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-bdzg6\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.765112 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-config\") pod \"dnsmasq-dns-86db49b7ff-bdzg6\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.765164 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-bdzg6\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.765224 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-bdzg6\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.765244 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x59pj\" (UniqueName: \"kubernetes.io/projected/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-kube-api-access-x59pj\") pod \"dnsmasq-dns-86db49b7ff-bdzg6\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.766150 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-bdzg6\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.766427 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-bdzg6\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.766691 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-config\") pod \"dnsmasq-dns-86db49b7ff-bdzg6\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.766707 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-bdzg6\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.781693 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x59pj\" (UniqueName: \"kubernetes.io/projected/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-kube-api-access-x59pj\") pod \"dnsmasq-dns-86db49b7ff-bdzg6\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.833928 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.843650 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.865803 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-ovsdbserver-nb\") pod \"8f04d7df-0cba-4b47-a033-404303aa6e40\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.865834 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-config\") pod \"8f04d7df-0cba-4b47-a033-404303aa6e40\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.865857 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sq6q\" (UniqueName: \"kubernetes.io/projected/8f04d7df-0cba-4b47-a033-404303aa6e40-kube-api-access-8sq6q\") pod \"8f04d7df-0cba-4b47-a033-404303aa6e40\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.865969 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-dns-svc\") pod \"8f04d7df-0cba-4b47-a033-404303aa6e40\" (UID: \"8f04d7df-0cba-4b47-a033-404303aa6e40\") " Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.866258 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8f04d7df-0cba-4b47-a033-404303aa6e40" (UID: 
"8f04d7df-0cba-4b47-a033-404303aa6e40"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.866415 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8f04d7df-0cba-4b47-a033-404303aa6e40" (UID: "8f04d7df-0cba-4b47-a033-404303aa6e40"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.866454 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-config" (OuterVolumeSpecName: "config") pod "8f04d7df-0cba-4b47-a033-404303aa6e40" (UID: "8f04d7df-0cba-4b47-a033-404303aa6e40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.869468 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f04d7df-0cba-4b47-a033-404303aa6e40-kube-api-access-8sq6q" (OuterVolumeSpecName: "kube-api-access-8sq6q") pod "8f04d7df-0cba-4b47-a033-404303aa6e40" (UID: "8f04d7df-0cba-4b47-a033-404303aa6e40"). InnerVolumeSpecName "kube-api-access-8sq6q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.967276 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.967582 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.967593 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f04d7df-0cba-4b47-a033-404303aa6e40-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.967602 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sq6q\" (UniqueName: \"kubernetes.io/projected/8f04d7df-0cba-4b47-a033-404303aa6e40-kube-api-access-8sq6q\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:17 crc kubenswrapper[4756]: I0318 14:20:17.972066 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.072549 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.072772 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:ovsdb-server-init,Image:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n664h578h675h5b5h8fhbdhffh79h59fh9ch545h5bhb6h58dh57dh54bh666hd8h58bh598h677h588h554h59bh59h5fchb9hc4hc9h5f8h5c9h66dq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mwtrs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,
SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-ovs-vsj9n_openstack(3f5c347c-244f-40b6-8311-8eac0e22626a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.073910 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-ovs-vsj9n" podUID="3f5c347c-244f-40b6-8311-8eac0e22626a" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.135693 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.135897 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-query-frontend,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30,Command:[],Args:[-target=query-frontend -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml 
-config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-query-frontend-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-query-frontend-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4q6m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-query-frontend-6f54889599-4pgf5_openstack(4999c7cd-7963-42e4-8404-a0203664d331): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.138238 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-query-frontend\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" podUID="4999c7cd-7963-42e4-8404-a0203664d331" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.151809 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.152204 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:loki-compactor,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30,Command:[],Args:[-target=compactor -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-compactor-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-compactor-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveRea
dOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g8tld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-compactor-0_openstack(06157d4c-39c0-4895-b002-79ee7b960512): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.153439 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-compactor\" 
with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="06157d4c-39c0-4895-b002-79ee7b960512" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.167556 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.167725 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-distributor,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30,Command:[],Args:[-target=distributor -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml 
-config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-distributor-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-distributor-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l7lgr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd_openstack(e7acb694-7937-45a7-8aab-c2175fae6423): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.169006 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-distributor\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" podUID="e7acb694-7937-45a7-8aab-c2175fae6423" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.201304 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.201520 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:loki-index-gateway,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30,Command:[],Args:[-target=index-gateway -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-index-gateway-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-index-gateway-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathEx
pr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4vj65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-index-gateway-0_openstack(3528f495-dffb-47d3-99fe-69054008e8cd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.205221 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"loki-index-gateway\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" podUID="3528f495-dffb-47d3-99fe-69054008e8cd" Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.345505 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.378767 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8994855c-7079-4927-ac5d-1d875288634d-dns-svc\") pod \"8994855c-7079-4927-ac5d-1d875288634d\" (UID: \"8994855c-7079-4927-ac5d-1d875288634d\") " Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.379020 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzz7v\" (UniqueName: \"kubernetes.io/projected/8994855c-7079-4927-ac5d-1d875288634d-kube-api-access-nzz7v\") pod \"8994855c-7079-4927-ac5d-1d875288634d\" (UID: \"8994855c-7079-4927-ac5d-1d875288634d\") " Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.379041 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8994855c-7079-4927-ac5d-1d875288634d-config\") pod \"8994855c-7079-4927-ac5d-1d875288634d\" (UID: \"8994855c-7079-4927-ac5d-1d875288634d\") " Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.385682 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8994855c-7079-4927-ac5d-1d875288634d-kube-api-access-nzz7v" (OuterVolumeSpecName: "kube-api-access-nzz7v") pod "8994855c-7079-4927-ac5d-1d875288634d" (UID: "8994855c-7079-4927-ac5d-1d875288634d"). InnerVolumeSpecName "kube-api-access-nzz7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.420320 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8994855c-7079-4927-ac5d-1d875288634d-config" (OuterVolumeSpecName: "config") pod "8994855c-7079-4927-ac5d-1d875288634d" (UID: "8994855c-7079-4927-ac5d-1d875288634d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.423680 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8994855c-7079-4927-ac5d-1d875288634d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8994855c-7079-4927-ac5d-1d875288634d" (UID: "8994855c-7079-4927-ac5d-1d875288634d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.481337 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8994855c-7079-4927-ac5d-1d875288634d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.481368 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzz7v\" (UniqueName: \"kubernetes.io/projected/8994855c-7079-4927-ac5d-1d875288634d-kube-api-access-nzz7v\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.481377 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8994855c-7079-4927-ac5d-1d875288634d-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.515478 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" podUID="8994855c-7079-4927-ac5d-1d875288634d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: i/o timeout" Mar 18 
14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.844669 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" event={"ID":"8994855c-7079-4927-ac5d-1d875288634d","Type":"ContainerDied","Data":"465223d3c738fa6a4069898fb3a2802d0714b764ea033f9168e29d9d6b4764bf"} Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.844727 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-pfbm4" Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.844742 4756 scope.go:117] "RemoveContainer" containerID="0ab432c8867b24b7c18085a28381de472c337873630cc7faa2b863a5c84b4a04" Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.844834 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-b7mcw" Mar 18 14:20:18 crc kubenswrapper[4756]: E0318 14:20:18.849417 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified\\\"\"" pod="openstack/ovn-controller-ovs-vsj9n" podUID="3f5c347c-244f-40b6-8311-8eac0e22626a" Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.954400 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-pfbm4"] Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.965254 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-pfbm4"] Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.988450 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-b7mcw"] Mar 18 14:20:18 crc kubenswrapper[4756]: I0318 14:20:18.996573 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-b7mcw"] Mar 18 14:20:19 crc kubenswrapper[4756]: E0318 14:20:19.091354 4756 log.go:32] "PullImage 
from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Mar 18 14:20:19 crc kubenswrapper[4756]: E0318 14:20:19.091566 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n664h578h675h5b5h8fhbdhffh79h59fh9ch545h5bhb6h58dh57dh54bh666hd8h58bh598h677h588h554h59bh59h5fchb9hc4hc9h5f8h5c9h66dq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:n
il,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ll9mf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-r6s8c_openstack(99dfb896-59f3-4f93-8d0e-4b19b49cbc56): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" 
Mar 18 14:20:19 crc kubenswrapper[4756]: E0318 14:20:19.092736 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-r6s8c" podUID="99dfb896-59f3-4f93-8d0e-4b19b49cbc56" Mar 18 14:20:19 crc kubenswrapper[4756]: I0318 14:20:19.330030 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8994855c-7079-4927-ac5d-1d875288634d" path="/var/lib/kubelet/pods/8994855c-7079-4927-ac5d-1d875288634d/volumes" Mar 18 14:20:19 crc kubenswrapper[4756]: I0318 14:20:19.330830 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f04d7df-0cba-4b47-a033-404303aa6e40" path="/var/lib/kubelet/pods/8f04d7df-0cba-4b47-a033-404303aa6e40/volumes" Mar 18 14:20:19 crc kubenswrapper[4756]: I0318 14:20:19.700364 4756 scope.go:117] "RemoveContainer" containerID="d8f44e38dd13ddcebf48393e47e6f7edb28b586f4c3fd8adf21387278542e8a7" Mar 18 14:20:19 crc kubenswrapper[4756]: E0318 14:20:19.711470 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 18 14:20:19 crc kubenswrapper[4756]: E0318 14:20:19.711508 4756 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 18 14:20:19 crc kubenswrapper[4756]: E0318 14:20:19.711707 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods 
--namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-46x77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(67dc4771-f106-4f33-9f84-0d7251e4259d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" logger="UnhandledError" Mar 18 14:20:19 crc kubenswrapper[4756]: E0318 14:20:19.715513 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="67dc4771-f106-4f33-9f84-0d7251e4259d" Mar 18 14:20:19 crc kubenswrapper[4756]: E0318 14:20:19.890260 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-r6s8c" podUID="99dfb896-59f3-4f93-8d0e-4b19b49cbc56" Mar 18 14:20:19 crc kubenswrapper[4756]: E0318 14:20:19.897928 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="67dc4771-f106-4f33-9f84-0d7251e4259d" Mar 18 14:20:20 crc kubenswrapper[4756]: I0318 14:20:20.152180 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-gkrbd"] Mar 18 14:20:20 crc kubenswrapper[4756]: I0318 14:20:20.207508 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bdzg6"] Mar 18 14:20:20 crc kubenswrapper[4756]: W0318 14:20:20.499608 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded20a6fd_ef4b_43bc_bbb2_6cacdc8f6e3e.slice/crio-71ee9cf04a04abc3b311e069a8579c208d612cd229cd2b9f52c9d2ff9427d85c WatchSource:0}: Error finding container 71ee9cf04a04abc3b311e069a8579c208d612cd229cd2b9f52c9d2ff9427d85c: Status 404 returned error can't 
find the container with id 71ee9cf04a04abc3b311e069a8579c208d612cd229cd2b9f52c9d2ff9427d85c Mar 18 14:20:20 crc kubenswrapper[4756]: I0318 14:20:20.885599 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe","Type":"ContainerStarted","Data":"ef55c73a93eb95d1d3d9190fa436b9aa1651707e2154da9cad984e12a19a27e7"} Mar 18 14:20:20 crc kubenswrapper[4756]: I0318 14:20:20.888963 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"18faa7ad-0836-473a-aebe-0a6e5357b554","Type":"ContainerStarted","Data":"0cd93ecbd0711bdd6941046ea11cfcdfe0d8c32e85f5c75bcb83d249ab7a5336"} Mar 18 14:20:20 crc kubenswrapper[4756]: I0318 14:20:20.889139 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 18 14:20:20 crc kubenswrapper[4756]: I0318 14:20:20.891397 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-gkrbd" event={"ID":"ac7f17bf-2987-447a-a61b-c0b97615ced5","Type":"ContainerStarted","Data":"746ffa8c26bc2a70ce2bcf9c81b3128042b532d25a714e55409f1d51eff21f26"} Mar 18 14:20:20 crc kubenswrapper[4756]: I0318 14:20:20.894606 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" event={"ID":"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e","Type":"ContainerStarted","Data":"71ee9cf04a04abc3b311e069a8579c208d612cd229cd2b9f52c9d2ff9427d85c"} Mar 18 14:20:20 crc kubenswrapper[4756]: I0318 14:20:20.950746 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=29.001259903 podStartE2EDuration="33.950724512s" podCreationTimestamp="2026-03-18 14:19:47 +0000 UTC" firstStartedPulling="2026-03-18 14:19:59.786321507 +0000 UTC m=+1201.100739492" lastFinishedPulling="2026-03-18 14:20:04.735786126 +0000 UTC m=+1206.050204101" observedRunningTime="2026-03-18 14:20:20.942752776 +0000 UTC 
m=+1222.257170771" watchObservedRunningTime="2026-03-18 14:20:20.950724512 +0000 UTC m=+1222.265142487" Mar 18 14:20:21 crc kubenswrapper[4756]: I0318 14:20:21.908101 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" event={"ID":"0e90a533-9bd6-4f94-a8c4-52218f2919b0","Type":"ContainerStarted","Data":"6b0bfce5b21a1587f5a071a7fa0c05bc462458f6c8a03851705745569e0cd84b"} Mar 18 14:20:21 crc kubenswrapper[4756]: I0318 14:20:21.908475 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:20:21 crc kubenswrapper[4756]: I0318 14:20:21.913278 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" event={"ID":"4999c7cd-7963-42e4-8404-a0203664d331","Type":"ContainerStarted","Data":"10607a8b4247d97e6cfbb8983fc04821d9de077ce3a8e6edb8b05640c838a399"} Mar 18 14:20:21 crc kubenswrapper[4756]: I0318 14:20:21.913479 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:20:21 crc kubenswrapper[4756]: I0318 14:20:21.916134 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8","Type":"ContainerStarted","Data":"e010be0eec2cc1725d7882a84a77ed19c2dc700c86a2f0bfd1d11cd3394a136e"} Mar 18 14:20:21 crc kubenswrapper[4756]: I0318 14:20:21.918266 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" event={"ID":"e7acb694-7937-45a7-8aab-c2175fae6423","Type":"ContainerStarted","Data":"5c66f52547bd528c1310f413e652ab8f14413dcbea16dcacc67f715678979160"} Mar 18 14:20:21 crc kubenswrapper[4756]: I0318 14:20:21.930286 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" podStartSLOduration=8.259823936 podStartE2EDuration="24.930269488s" podCreationTimestamp="2026-03-18 14:19:57 +0000 UTC" firstStartedPulling="2026-03-18 14:20:02.451412949 +0000 UTC m=+1203.765830924" lastFinishedPulling="2026-03-18 14:20:19.121858501 +0000 UTC m=+1220.436276476" observedRunningTime="2026-03-18 14:20:21.926316501 +0000 UTC m=+1223.240734516" watchObservedRunningTime="2026-03-18 14:20:21.930269488 +0000 UTC m=+1223.244687463" Mar 18 14:20:21 crc kubenswrapper[4756]: I0318 14:20:21.944000 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-p44r5" Mar 18 14:20:21 crc kubenswrapper[4756]: I0318 14:20:21.986835 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" podStartSLOduration=-9223372011.867964 podStartE2EDuration="24.986812047s" podCreationTimestamp="2026-03-18 14:19:57 +0000 UTC" firstStartedPulling="2026-03-18 14:20:02.45776004 +0000 UTC m=+1203.772178015" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:20:21.979250692 +0000 UTC m=+1223.293668677" watchObservedRunningTime="2026-03-18 14:20:21.986812047 +0000 UTC m=+1223.301230022" Mar 18 14:20:21 crc kubenswrapper[4756]: I0318 14:20:21.998093 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" podStartSLOduration=-9223372011.856703 podStartE2EDuration="24.998073111s" podCreationTimestamp="2026-03-18 14:19:57 +0000 UTC" firstStartedPulling="2026-03-18 14:20:02.466386403 +0000 UTC m=+1203.780804388" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:20:21.995932273 +0000 UTC m=+1223.310350258" watchObservedRunningTime="2026-03-18 14:20:21.998073111 +0000 UTC m=+1223.312491086" Mar 18 14:20:22 crc 
kubenswrapper[4756]: I0318 14:20:22.936612 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"06157d4c-39c0-4895-b002-79ee7b960512","Type":"ContainerStarted","Data":"ea59af8e62ad4f5e519264b53797c51c5720d4c06011f86e27cace4908d34a49"} Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.937255 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.939933 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce","Type":"ContainerStarted","Data":"e66e72f8966d8210856953e9f931b81d3562014f300152eb8832f6595cff013c"} Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.942423 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"ca8fb9b6-a1a9-4781-af8f-2e7e78e62771","Type":"ContainerStarted","Data":"c53810fdb1696119047287d9b29df8a69e67152961016b0ebb0613370cc6edc5"} Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.943470 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.948302 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" event={"ID":"a3a9e126-317d-4f52-a4f8-e657dfa9930c","Type":"ContainerStarted","Data":"46376b7f3390aaceea70b7e1197fcfe87952a1d5c6aad098e587d0e2aa872b48"} Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.948554 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.952166 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"ba7aaad4-94c3-4202-a512-a84cba9bcb9f","Type":"ContainerStarted","Data":"8bdd0e2cf5165311c02c004eb5882c9265eb6dac02828b6247c1895ba605f534"} Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.956418 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"3528f495-dffb-47d3-99fe-69054008e8cd","Type":"ContainerStarted","Data":"ed3ddfedb9cce0e53b31209f8abf9a3f5a1c0ecccc1935cf3bc91a13c0321fd2"} Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.957287 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.960245 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" event={"ID":"7edadc4c-acee-49a2-b629-c4505d40eebc","Type":"ContainerStarted","Data":"eef7889c7f3f09f01ec72585798f52eb75b61117ffb08e4492331edc459058a3"} Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.960409 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=-9223372010.894392 podStartE2EDuration="25.960383802s" podCreationTimestamp="2026-03-18 14:19:57 +0000 UTC" firstStartedPulling="2026-03-18 14:20:01.672661592 +0000 UTC m=+1202.987079567" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:20:22.954085961 +0000 UTC m=+1224.268503966" watchObservedRunningTime="2026-03-18 14:20:22.960383802 +0000 UTC m=+1224.274801807" Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.961045 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.966586 4756 generic.go:334] "Generic (PLEG): container finished" podID="d4432811-b291-4fac-a2e6-ad17c9d83f51" 
containerID="60d7ce43d6b4d49750a3f726650b715ceb1c8c79673e18a2165fd579133993d9" exitCode=0 Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.966727 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564060-mkctk" event={"ID":"d4432811-b291-4fac-a2e6-ad17c9d83f51","Type":"ContainerDied","Data":"60d7ce43d6b4d49750a3f726650b715ceb1c8c79673e18a2165fd579133993d9"} Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.968447 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.969466 4756 generic.go:334] "Generic (PLEG): container finished" podID="ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e" containerID="32b703f57f97e642ba696479c2250e513ef0d1c5be17e0d34ef97684e254388d" exitCode=0 Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.972178 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" event={"ID":"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e","Type":"ContainerDied","Data":"32b703f57f97e642ba696479c2250e513ef0d1c5be17e0d34ef97684e254388d"} Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.974688 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g" podStartSLOduration=8.72970071 podStartE2EDuration="25.974665118s" podCreationTimestamp="2026-03-18 14:19:57 +0000 UTC" firstStartedPulling="2026-03-18 14:20:02.464010769 +0000 UTC m=+1203.778428744" lastFinishedPulling="2026-03-18 14:20:19.708975177 +0000 UTC m=+1221.023393152" observedRunningTime="2026-03-18 14:20:22.973202028 +0000 UTC m=+1224.287620043" watchObservedRunningTime="2026-03-18 14:20:22.974665118 +0000 UTC m=+1224.289083113" Mar 18 14:20:22 crc kubenswrapper[4756]: I0318 14:20:22.994912 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=7.926655198 podStartE2EDuration="25.994894835s" podCreationTimestamp="2026-03-18 14:19:57 +0000 UTC" firstStartedPulling="2026-03-18 14:20:01.647352478 +0000 UTC m=+1202.961770453" lastFinishedPulling="2026-03-18 14:20:19.715592115 +0000 UTC m=+1221.030010090" observedRunningTime="2026-03-18 14:20:22.993019344 +0000 UTC m=+1224.307437319" watchObservedRunningTime="2026-03-18 14:20:22.994894835 +0000 UTC m=+1224.309312810" Mar 18 14:20:23 crc kubenswrapper[4756]: I0318 14:20:23.113743 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" podStartSLOduration=9.447487018 podStartE2EDuration="26.113719167s" podCreationTimestamp="2026-03-18 14:19:57 +0000 UTC" firstStartedPulling="2026-03-18 14:20:02.454996505 +0000 UTC m=+1203.769414490" lastFinishedPulling="2026-03-18 14:20:19.121228664 +0000 UTC m=+1220.435646639" observedRunningTime="2026-03-18 14:20:23.105398002 +0000 UTC m=+1224.419815997" watchObservedRunningTime="2026-03-18 14:20:23.113719167 +0000 UTC m=+1224.428137142" Mar 18 14:20:23 crc kubenswrapper[4756]: I0318 14:20:23.979826 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:20:23 crc kubenswrapper[4756]: I0318 14:20:23.984948 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"228ca85e-a493-4dc4-9b95-5148c92ba228","Type":"ContainerStarted","Data":"9e1cdfeacb5d741083531a9b44627ebef31f79c1b82b6e7e2e4feb1175a89cf9"} Mar 18 14:20:23 crc kubenswrapper[4756]: I0318 14:20:23.986831 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"529ae791-8631-4ca5-9e4b-bb857d6264a8","Type":"ContainerStarted","Data":"1b86114c70b8e3987a8b62fafaa77d3c5b209234cba6f34519808c6660a316ba"} Mar 18 14:20:23 crc kubenswrapper[4756]: I0318 14:20:23.989045 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d691251-328b-4c09-98d1-4b968ab5bc05","Type":"ContainerStarted","Data":"a1ff35615a64ea0b0976834b6197c47f1c1b410ff679b5b20035e885f047d65c"} Mar 18 14:20:23 crc kubenswrapper[4756]: I0318 14:20:23.991289 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4","Type":"ContainerStarted","Data":"31d5c7a181782071956cca736972b51305ecc85093231ffa7b86ca1629fcdf23"} Mar 18 14:20:24 crc kubenswrapper[4756]: I0318 14:20:24.012403 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=-9223372009.842411 podStartE2EDuration="27.012365416s" podCreationTimestamp="2026-03-18 14:19:57 +0000 UTC" firstStartedPulling="2026-03-18 14:20:02.430630956 +0000 UTC m=+1203.745048971" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:20:23.12340688 +0000 UTC m=+1224.437824855" watchObservedRunningTime="2026-03-18 14:20:24.012365416 +0000 UTC m=+1225.326783391" Mar 18 14:20:24 crc kubenswrapper[4756]: I0318 14:20:24.474900 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564060-mkctk" Mar 18 14:20:24 crc kubenswrapper[4756]: I0318 14:20:24.515385 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vdzb\" (UniqueName: \"kubernetes.io/projected/d4432811-b291-4fac-a2e6-ad17c9d83f51-kube-api-access-5vdzb\") pod \"d4432811-b291-4fac-a2e6-ad17c9d83f51\" (UID: \"d4432811-b291-4fac-a2e6-ad17c9d83f51\") " Mar 18 14:20:24 crc kubenswrapper[4756]: I0318 14:20:24.525106 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4432811-b291-4fac-a2e6-ad17c9d83f51-kube-api-access-5vdzb" (OuterVolumeSpecName: "kube-api-access-5vdzb") pod "d4432811-b291-4fac-a2e6-ad17c9d83f51" (UID: "d4432811-b291-4fac-a2e6-ad17c9d83f51"). InnerVolumeSpecName "kube-api-access-5vdzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:24 crc kubenswrapper[4756]: I0318 14:20:24.618047 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vdzb\" (UniqueName: \"kubernetes.io/projected/d4432811-b291-4fac-a2e6-ad17c9d83f51-kube-api-access-5vdzb\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:24 crc kubenswrapper[4756]: I0318 14:20:24.998889 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564060-mkctk" event={"ID":"d4432811-b291-4fac-a2e6-ad17c9d83f51","Type":"ContainerDied","Data":"3231feb74a862ffdb822b39057816aaa4476508d68101ef6629229c10856de16"} Mar 18 14:20:24 crc kubenswrapper[4756]: I0318 14:20:24.999206 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3231feb74a862ffdb822b39057816aaa4476508d68101ef6629229c10856de16" Mar 18 14:20:25 crc kubenswrapper[4756]: I0318 14:20:24.999257 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564060-mkctk" Mar 18 14:20:25 crc kubenswrapper[4756]: I0318 14:20:25.010275 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ee4b1cdc-8b62-42b6-9bc7-61164f90afb4","Type":"ContainerStarted","Data":"a761440b305d99b9837b10ec09fcd8005b4e4b56be755273719fb8b7f84c7990"} Mar 18 14:20:25 crc kubenswrapper[4756]: I0318 14:20:25.024187 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:25 crc kubenswrapper[4756]: I0318 14:20:25.032180 4756 generic.go:334] "Generic (PLEG): container finished" podID="7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe" containerID="ef55c73a93eb95d1d3d9190fa436b9aa1651707e2154da9cad984e12a19a27e7" exitCode=0 Mar 18 14:20:25 crc kubenswrapper[4756]: I0318 14:20:25.032259 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe","Type":"ContainerDied","Data":"ef55c73a93eb95d1d3d9190fa436b9aa1651707e2154da9cad984e12a19a27e7"} Mar 18 14:20:25 crc kubenswrapper[4756]: I0318 14:20:25.038046 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.903487592 podStartE2EDuration="29.038026339s" podCreationTimestamp="2026-03-18 14:19:56 +0000 UTC" firstStartedPulling="2026-03-18 14:20:02.588938177 +0000 UTC m=+1203.903356152" lastFinishedPulling="2026-03-18 14:20:24.723476924 +0000 UTC m=+1226.037894899" observedRunningTime="2026-03-18 14:20:25.026732424 +0000 UTC m=+1226.341150419" watchObservedRunningTime="2026-03-18 14:20:25.038026339 +0000 UTC m=+1226.352444314" Mar 18 14:20:25 crc kubenswrapper[4756]: I0318 14:20:25.054880 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" podStartSLOduration=8.054848194 podStartE2EDuration="8.054848194s" 
podCreationTimestamp="2026-03-18 14:20:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:20:25.049382916 +0000 UTC m=+1226.363800891" watchObservedRunningTime="2026-03-18 14:20:25.054848194 +0000 UTC m=+1226.369266169" Mar 18 14:20:25 crc kubenswrapper[4756]: I0318 14:20:25.508478 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.21633547 podStartE2EDuration="33.508454569s" podCreationTimestamp="2026-03-18 14:19:52 +0000 UTC" firstStartedPulling="2026-03-18 14:20:02.431463288 +0000 UTC m=+1203.745881263" lastFinishedPulling="2026-03-18 14:20:24.723582387 +0000 UTC m=+1226.038000362" observedRunningTime="2026-03-18 14:20:25.090678103 +0000 UTC m=+1226.405096078" watchObservedRunningTime="2026-03-18 14:20:25.508454569 +0000 UTC m=+1226.822872544" Mar 18 14:20:25 crc kubenswrapper[4756]: I0318 14:20:25.557922 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564054-kbzrm"] Mar 18 14:20:25 crc kubenswrapper[4756]: I0318 14:20:25.566900 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564054-kbzrm"] Mar 18 14:20:26 crc kubenswrapper[4756]: I0318 14:20:26.051455 4756 generic.go:334] "Generic (PLEG): container finished" podID="b5d3bcfe-0ae1-4104-8433-ffb4569a29d8" containerID="e010be0eec2cc1725d7882a84a77ed19c2dc700c86a2f0bfd1d11cd3394a136e" exitCode=0 Mar 18 14:20:26 crc kubenswrapper[4756]: I0318 14:20:26.051580 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8","Type":"ContainerDied","Data":"e010be0eec2cc1725d7882a84a77ed19c2dc700c86a2f0bfd1d11cd3394a136e"} Mar 18 14:20:26 crc kubenswrapper[4756]: I0318 14:20:26.055634 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" event={"ID":"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e","Type":"ContainerStarted","Data":"728e6a6487d03eeb4a1fd1c97ea905fce1d3e948f5b34eea1963dde9eac33b57"} Mar 18 14:20:26 crc kubenswrapper[4756]: I0318 14:20:26.058791 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe","Type":"ContainerStarted","Data":"d78bae6b1b71de09ad1d786827aef7d00d591925ac85c0aa76727d6e4e988d1d"} Mar 18 14:20:26 crc kubenswrapper[4756]: I0318 14:20:26.061465 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ba7aaad4-94c3-4202-a512-a84cba9bcb9f","Type":"ContainerStarted","Data":"2a69c3c2e9c2ddb9aefd4ae4be9de68809eec4859b7af5ed90b69516ae99660b"} Mar 18 14:20:26 crc kubenswrapper[4756]: I0318 14:20:26.071363 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-gkrbd" event={"ID":"ac7f17bf-2987-447a-a61b-c0b97615ced5","Type":"ContainerStarted","Data":"e1f3077db6f0edf55e67c65fcdd2dcb800fc81afc29d12307b2a48564e39ff50"} Mar 18 14:20:26 crc kubenswrapper[4756]: I0318 14:20:26.123219 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-gkrbd" podStartSLOduration=4.619077814 podStartE2EDuration="9.123190551s" podCreationTimestamp="2026-03-18 14:20:17 +0000 UTC" firstStartedPulling="2026-03-18 14:20:20.218825423 +0000 UTC m=+1221.533243398" lastFinishedPulling="2026-03-18 14:20:24.72293816 +0000 UTC m=+1226.037356135" observedRunningTime="2026-03-18 14:20:26.112988025 +0000 UTC m=+1227.427406010" watchObservedRunningTime="2026-03-18 14:20:26.123190551 +0000 UTC m=+1227.437608586" Mar 18 14:20:26 crc kubenswrapper[4756]: I0318 14:20:26.171860 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=24.474018744 podStartE2EDuration="42.171841476s" 
podCreationTimestamp="2026-03-18 14:19:44 +0000 UTC" firstStartedPulling="2026-03-18 14:20:00.841052816 +0000 UTC m=+1202.155470791" lastFinishedPulling="2026-03-18 14:20:18.538875558 +0000 UTC m=+1219.853293523" observedRunningTime="2026-03-18 14:20:26.145761721 +0000 UTC m=+1227.460179696" watchObservedRunningTime="2026-03-18 14:20:26.171841476 +0000 UTC m=+1227.486259451" Mar 18 14:20:26 crc kubenswrapper[4756]: I0318 14:20:26.220165 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 14:20:26 crc kubenswrapper[4756]: I0318 14:20:26.220239 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 14:20:27 crc kubenswrapper[4756]: I0318 14:20:27.085857 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b5d3bcfe-0ae1-4104-8433-ffb4569a29d8","Type":"ContainerStarted","Data":"76c329183710ec19a4ef2b4d9fd5d0e14795aaa503eb602b3464442627e07ccd"} Mar 18 14:20:27 crc kubenswrapper[4756]: I0318 14:20:27.112496 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.756190485 podStartE2EDuration="41.11246453s" podCreationTimestamp="2026-03-18 14:19:46 +0000 UTC" firstStartedPulling="2026-03-18 14:20:00.986681504 +0000 UTC m=+1202.301099469" lastFinishedPulling="2026-03-18 14:20:19.342955539 +0000 UTC m=+1220.657373514" observedRunningTime="2026-03-18 14:20:27.10432339 +0000 UTC m=+1228.418741375" watchObservedRunningTime="2026-03-18 14:20:27.11246453 +0000 UTC m=+1228.426882545" Mar 18 14:20:27 crc kubenswrapper[4756]: I0318 14:20:27.147981 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 14:20:27 crc kubenswrapper[4756]: I0318 14:20:27.184414 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 
14:20:27 crc kubenswrapper[4756]: I0318 14:20:27.332165 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1a89d50-a68b-4237-8259-af1876fd0f8e" path="/var/lib/kubelet/pods/c1a89d50-a68b-4237-8259-af1876fd0f8e/volumes" Mar 18 14:20:27 crc kubenswrapper[4756]: I0318 14:20:27.437403 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 14:20:27 crc kubenswrapper[4756]: I0318 14:20:27.437573 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 14:20:27 crc kubenswrapper[4756]: I0318 14:20:27.691610 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:20:27 crc kubenswrapper[4756]: I0318 14:20:27.776862 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 14:20:27 crc kubenswrapper[4756]: I0318 14:20:27.966537 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 18 14:20:27 crc kubenswrapper[4756]: I0318 14:20:27.966594 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.012784 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.094878 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.137112 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.137188 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 14:20:28 crc 
kubenswrapper[4756]: I0318 14:20:28.503885 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 14:20:28 crc kubenswrapper[4756]: E0318 14:20:28.504271 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8994855c-7079-4927-ac5d-1d875288634d" containerName="init" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.504287 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8994855c-7079-4927-ac5d-1d875288634d" containerName="init" Mar 18 14:20:28 crc kubenswrapper[4756]: E0318 14:20:28.504321 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8994855c-7079-4927-ac5d-1d875288634d" containerName="dnsmasq-dns" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.504328 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8994855c-7079-4927-ac5d-1d875288634d" containerName="dnsmasq-dns" Mar 18 14:20:28 crc kubenswrapper[4756]: E0318 14:20:28.504340 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4432811-b291-4fac-a2e6-ad17c9d83f51" containerName="oc" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.504348 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4432811-b291-4fac-a2e6-ad17c9d83f51" containerName="oc" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.504550 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="8994855c-7079-4927-ac5d-1d875288634d" containerName="dnsmasq-dns" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.504571 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4432811-b291-4fac-a2e6-ad17c9d83f51" containerName="oc" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.505806 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.507493 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.508230 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-4p4sc" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.508361 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.508658 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.537501 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.591634 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.591708 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.591757 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwbpw\" (UniqueName: \"kubernetes.io/projected/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-kube-api-access-wwbpw\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " 
pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.591784 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-scripts\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.591862 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-config\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.591883 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.591948 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.693267 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.693344 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wwbpw\" (UniqueName: \"kubernetes.io/projected/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-kube-api-access-wwbpw\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.693372 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-scripts\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.693419 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-config\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.693440 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.693468 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.693512 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.694225 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.695354 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-config\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.695393 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-scripts\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.699008 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.699500 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.700900 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.710690 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwbpw\" (UniqueName: \"kubernetes.io/projected/ac919d8c-6a4e-4239-a76c-5cbefcd01ce6-kube-api-access-wwbpw\") pod \"ovn-northd-0\" (UID: \"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6\") " pod="openstack/ovn-northd-0" Mar 18 14:20:28 crc kubenswrapper[4756]: I0318 14:20:28.822821 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 14:20:29 crc kubenswrapper[4756]: I0318 14:20:29.137955 4756 generic.go:334] "Generic (PLEG): container finished" podID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerID="a1ff35615a64ea0b0976834b6197c47f1c1b410ff679b5b20035e885f047d65c" exitCode=0 Mar 18 14:20:29 crc kubenswrapper[4756]: I0318 14:20:29.138183 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d691251-328b-4c09-98d1-4b968ab5bc05","Type":"ContainerDied","Data":"a1ff35615a64ea0b0976834b6197c47f1c1b410ff679b5b20035e885f047d65c"} Mar 18 14:20:29 crc kubenswrapper[4756]: I0318 14:20:29.162465 4756 generic.go:334] "Generic (PLEG): container finished" podID="529ae791-8631-4ca5-9e4b-bb857d6264a8" containerID="1b86114c70b8e3987a8b62fafaa77d3c5b209234cba6f34519808c6660a316ba" exitCode=0 Mar 18 14:20:29 crc kubenswrapper[4756]: I0318 14:20:29.163586 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"529ae791-8631-4ca5-9e4b-bb857d6264a8","Type":"ContainerDied","Data":"1b86114c70b8e3987a8b62fafaa77d3c5b209234cba6f34519808c6660a316ba"} Mar 18 14:20:29 crc kubenswrapper[4756]: I0318 14:20:29.297502 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] 
Mar 18 14:20:29 crc kubenswrapper[4756]: I0318 14:20:29.926554 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bdzg6"] Mar 18 14:20:29 crc kubenswrapper[4756]: I0318 14:20:29.927375 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" podUID="ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e" containerName="dnsmasq-dns" containerID="cri-o://728e6a6487d03eeb4a1fd1c97ea905fce1d3e948f5b34eea1963dde9eac33b57" gracePeriod=10 Mar 18 14:20:29 crc kubenswrapper[4756]: I0318 14:20:29.945789 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:29 crc kubenswrapper[4756]: I0318 14:20:29.972992 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-tw5k4"] Mar 18 14:20:29 crc kubenswrapper[4756]: I0318 14:20:29.976382 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:29 crc kubenswrapper[4756]: I0318 14:20:29.987047 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-tw5k4"] Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.126231 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-dns-svc\") pod \"dnsmasq-dns-698758b865-tw5k4\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.126279 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-config\") pod \"dnsmasq-dns-698758b865-tw5k4\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " pod="openstack/dnsmasq-dns-698758b865-tw5k4" 
Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.126401 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-tw5k4\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.126438 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxppw\" (UniqueName: \"kubernetes.io/projected/5feba85a-c965-4c24-a732-39ccc0ddbcf1-kube-api-access-jxppw\") pod \"dnsmasq-dns-698758b865-tw5k4\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.126487 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-tw5k4\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.170905 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6","Type":"ContainerStarted","Data":"1c9aaf6f729044c1eab158d75daf8edca1a391184c0f86c8fb21dcdfb377bbda"} Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.173817 4756 generic.go:334] "Generic (PLEG): container finished" podID="ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e" containerID="728e6a6487d03eeb4a1fd1c97ea905fce1d3e948f5b34eea1963dde9eac33b57" exitCode=0 Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.173919 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" 
event={"ID":"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e","Type":"ContainerDied","Data":"728e6a6487d03eeb4a1fd1c97ea905fce1d3e948f5b34eea1963dde9eac33b57"} Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.227602 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-tw5k4\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.227648 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxppw\" (UniqueName: \"kubernetes.io/projected/5feba85a-c965-4c24-a732-39ccc0ddbcf1-kube-api-access-jxppw\") pod \"dnsmasq-dns-698758b865-tw5k4\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.227671 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-tw5k4\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.227745 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-config\") pod \"dnsmasq-dns-698758b865-tw5k4\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.227764 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-dns-svc\") pod \"dnsmasq-dns-698758b865-tw5k4\" (UID: 
\"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.228621 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-dns-svc\") pod \"dnsmasq-dns-698758b865-tw5k4\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.228816 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-tw5k4\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.229315 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-config\") pod \"dnsmasq-dns-698758b865-tw5k4\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.229436 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-tw5k4\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.261385 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxppw\" (UniqueName: \"kubernetes.io/projected/5feba85a-c965-4c24-a732-39ccc0ddbcf1-kube-api-access-jxppw\") pod \"dnsmasq-dns-698758b865-tw5k4\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:30 crc 
kubenswrapper[4756]: I0318 14:20:30.298405 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.328947 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.464882 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.631076 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.740102 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x59pj\" (UniqueName: \"kubernetes.io/projected/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-kube-api-access-x59pj\") pod \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.740215 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-dns-svc\") pod \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.740340 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-ovsdbserver-sb\") pod \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.740393 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-config\") pod \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.740418 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-ovsdbserver-nb\") pod \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\" (UID: \"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e\") " Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.746034 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-kube-api-access-x59pj" (OuterVolumeSpecName: "kube-api-access-x59pj") pod "ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e" (UID: "ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e"). InnerVolumeSpecName "kube-api-access-x59pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.782977 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-config" (OuterVolumeSpecName: "config") pod "ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e" (UID: "ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.783068 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e" (UID: "ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.786623 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e" (UID: "ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.790050 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e" (UID: "ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.841963 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.842004 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.842021 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x59pj\" (UniqueName: \"kubernetes.io/projected/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-kube-api-access-x59pj\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.842032 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:30 crc 
kubenswrapper[4756]: I0318 14:20:30.842045 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:30 crc kubenswrapper[4756]: I0318 14:20:30.912387 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-tw5k4"] Mar 18 14:20:31 crc kubenswrapper[4756]: W0318 14:20:31.059165 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5feba85a_c965_4c24_a732_39ccc0ddbcf1.slice/crio-244e8cc88a7ba0703e8e51c94dcabbacf3f6cb9febb6652c9b741ceecef71cd1 WatchSource:0}: Error finding container 244e8cc88a7ba0703e8e51c94dcabbacf3f6cb9febb6652c9b741ceecef71cd1: Status 404 returned error can't find the container with id 244e8cc88a7ba0703e8e51c94dcabbacf3f6cb9febb6652c9b741ceecef71cd1 Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.116847 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 18 14:20:31 crc kubenswrapper[4756]: E0318 14:20:31.117176 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e" containerName="dnsmasq-dns" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.117187 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e" containerName="dnsmasq-dns" Mar 18 14:20:31 crc kubenswrapper[4756]: E0318 14:20:31.117208 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e" containerName="init" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.117214 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e" containerName="init" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.117382 4756 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e" containerName="dnsmasq-dns" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.122835 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.128816 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.128863 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-whxc7" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.128929 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.129002 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.160805 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.212666 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-tw5k4" event={"ID":"5feba85a-c965-4c24-a732-39ccc0ddbcf1","Type":"ContainerStarted","Data":"244e8cc88a7ba0703e8e51c94dcabbacf3f6cb9febb6652c9b741ceecef71cd1"} Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.215880 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" event={"ID":"ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e","Type":"ContainerDied","Data":"71ee9cf04a04abc3b311e069a8579c208d612cd229cd2b9f52c9d2ff9427d85c"} Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.215943 4756 scope.go:117] "RemoveContainer" containerID="728e6a6487d03eeb4a1fd1c97ea905fce1d3e948f5b34eea1963dde9eac33b57" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.216069 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bdzg6" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.255234 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/717e4d16-f5d1-4367-ad0e-baf820923225-cache\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.255611 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq2db\" (UniqueName: \"kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-kube-api-access-dq2db\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.255650 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1169fccc-fed1-4c7a-a43c-7662ca42a03c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1169fccc-fed1-4c7a-a43c-7662ca42a03c\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.255981 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717e4d16-f5d1-4367-ad0e-baf820923225-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.256089 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/717e4d16-f5d1-4367-ad0e-baf820923225-lock\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " 
pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.256131 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.272346 4756 scope.go:117] "RemoveContainer" containerID="32b703f57f97e642ba696479c2250e513ef0d1c5be17e0d34ef97684e254388d" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.296819 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bdzg6"] Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.303522 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bdzg6"] Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.329489 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e" path="/var/lib/kubelet/pods/ed20a6fd-ef4b-43bc-bbb2-6cacdc8f6e3e/volumes" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.359095 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717e4d16-f5d1-4367-ad0e-baf820923225-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.359165 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/717e4d16-f5d1-4367-ad0e-baf820923225-lock\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.359186 4756 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: E0318 14:20:31.359492 4756 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 14:20:31 crc kubenswrapper[4756]: E0318 14:20:31.359525 4756 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 14:20:31 crc kubenswrapper[4756]: E0318 14:20:31.359603 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift podName:717e4d16-f5d1-4367-ad0e-baf820923225 nodeName:}" failed. No retries permitted until 2026-03-18 14:20:31.859579448 +0000 UTC m=+1233.173997423 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift") pod "swift-storage-0" (UID: "717e4d16-f5d1-4367-ad0e-baf820923225") : configmap "swift-ring-files" not found Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.359609 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/717e4d16-f5d1-4367-ad0e-baf820923225-lock\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.360304 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/717e4d16-f5d1-4367-ad0e-baf820923225-cache\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.360332 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq2db\" (UniqueName: \"kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-kube-api-access-dq2db\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.360363 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1169fccc-fed1-4c7a-a43c-7662ca42a03c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1169fccc-fed1-4c7a-a43c-7662ca42a03c\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.360771 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/717e4d16-f5d1-4367-ad0e-baf820923225-cache\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.363558 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.363592 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1169fccc-fed1-4c7a-a43c-7662ca42a03c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1169fccc-fed1-4c7a-a43c-7662ca42a03c\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1e8015d1d811614efd6ac35afce872726c492ce7bdecfa13284fcf6a125ad823/globalmount\"" pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.370041 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717e4d16-f5d1-4367-ad0e-baf820923225-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.385856 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq2db\" (UniqueName: \"kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-kube-api-access-dq2db\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.411801 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1169fccc-fed1-4c7a-a43c-7662ca42a03c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1169fccc-fed1-4c7a-a43c-7662ca42a03c\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: I0318 14:20:31.872612 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift\") pod \"swift-storage-0\" (UID: 
\"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:31 crc kubenswrapper[4756]: E0318 14:20:31.872868 4756 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 14:20:31 crc kubenswrapper[4756]: E0318 14:20:31.873148 4756 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 14:20:31 crc kubenswrapper[4756]: E0318 14:20:31.873220 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift podName:717e4d16-f5d1-4367-ad0e-baf820923225 nodeName:}" failed. No retries permitted until 2026-03-18 14:20:32.873195495 +0000 UTC m=+1234.187613480 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift") pod "swift-storage-0" (UID: "717e4d16-f5d1-4367-ad0e-baf820923225") : configmap "swift-ring-files" not found Mar 18 14:20:32 crc kubenswrapper[4756]: I0318 14:20:32.254070 4756 generic.go:334] "Generic (PLEG): container finished" podID="5feba85a-c965-4c24-a732-39ccc0ddbcf1" containerID="eb33c1d4ad532d989ba7adbe90850f49596ea1c0cf12b6daf311e311b7a45432" exitCode=0 Mar 18 14:20:32 crc kubenswrapper[4756]: I0318 14:20:32.254402 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-tw5k4" event={"ID":"5feba85a-c965-4c24-a732-39ccc0ddbcf1","Type":"ContainerDied","Data":"eb33c1d4ad532d989ba7adbe90850f49596ea1c0cf12b6daf311e311b7a45432"} Mar 18 14:20:32 crc kubenswrapper[4756]: I0318 14:20:32.264542 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vsj9n" event={"ID":"3f5c347c-244f-40b6-8311-8eac0e22626a","Type":"ContainerStarted","Data":"dc7cd6569aa356e3f5406151088377cf77e1af96ec675c3809d2562822d182ae"} Mar 18 14:20:32 crc 
kubenswrapper[4756]: I0318 14:20:32.289747 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6","Type":"ContainerStarted","Data":"a0227394dece85f947d5fc81a5368eae9e9c8e32bec3b56b1eb9a8532f39c655"} Mar 18 14:20:32 crc kubenswrapper[4756]: I0318 14:20:32.290025 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ac919d8c-6a4e-4239-a76c-5cbefcd01ce6","Type":"ContainerStarted","Data":"68b7c9da98a29b402d28f22ecc5c84f3b4c37125304562b22b5c76c26af3638e"} Mar 18 14:20:32 crc kubenswrapper[4756]: I0318 14:20:32.290222 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 14:20:32 crc kubenswrapper[4756]: I0318 14:20:32.334380 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.491243229 podStartE2EDuration="4.334362875s" podCreationTimestamp="2026-03-18 14:20:28 +0000 UTC" firstStartedPulling="2026-03-18 14:20:29.303817872 +0000 UTC m=+1230.618235847" lastFinishedPulling="2026-03-18 14:20:31.146937518 +0000 UTC m=+1232.461355493" observedRunningTime="2026-03-18 14:20:32.323430029 +0000 UTC m=+1233.637848004" watchObservedRunningTime="2026-03-18 14:20:32.334362875 +0000 UTC m=+1233.648780850" Mar 18 14:20:32 crc kubenswrapper[4756]: I0318 14:20:32.894313 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:32 crc kubenswrapper[4756]: E0318 14:20:32.894548 4756 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 14:20:32 crc kubenswrapper[4756]: E0318 14:20:32.894664 4756 projected.go:194] Error preparing data for projected 
volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 14:20:32 crc kubenswrapper[4756]: E0318 14:20:32.894729 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift podName:717e4d16-f5d1-4367-ad0e-baf820923225 nodeName:}" failed. No retries permitted until 2026-03-18 14:20:34.894710976 +0000 UTC m=+1236.209128961 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift") pod "swift-storage-0" (UID: "717e4d16-f5d1-4367-ad0e-baf820923225") : configmap "swift-ring-files" not found Mar 18 14:20:33 crc kubenswrapper[4756]: I0318 14:20:33.299425 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-tw5k4" event={"ID":"5feba85a-c965-4c24-a732-39ccc0ddbcf1","Type":"ContainerStarted","Data":"c8105ff4eb910d4758dd4fae4a3311f0bc76a9446cfdcfbdf17d461a45e4b10a"} Mar 18 14:20:33 crc kubenswrapper[4756]: I0318 14:20:33.299551 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:33 crc kubenswrapper[4756]: I0318 14:20:33.302241 4756 generic.go:334] "Generic (PLEG): container finished" podID="3f5c347c-244f-40b6-8311-8eac0e22626a" containerID="dc7cd6569aa356e3f5406151088377cf77e1af96ec675c3809d2562822d182ae" exitCode=0 Mar 18 14:20:33 crc kubenswrapper[4756]: I0318 14:20:33.302311 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vsj9n" event={"ID":"3f5c347c-244f-40b6-8311-8eac0e22626a","Type":"ContainerDied","Data":"dc7cd6569aa356e3f5406151088377cf77e1af96ec675c3809d2562822d182ae"} Mar 18 14:20:33 crc kubenswrapper[4756]: I0318 14:20:33.304993 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r6s8c" 
event={"ID":"99dfb896-59f3-4f93-8d0e-4b19b49cbc56","Type":"ContainerStarted","Data":"0eb575803267de36343edee7424150b9f526a381d1e6eef1508a5dff0fdc21ea"} Mar 18 14:20:33 crc kubenswrapper[4756]: I0318 14:20:33.324463 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-tw5k4" podStartSLOduration=4.324447676 podStartE2EDuration="4.324447676s" podCreationTimestamp="2026-03-18 14:20:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:20:33.319141162 +0000 UTC m=+1234.633559137" watchObservedRunningTime="2026-03-18 14:20:33.324447676 +0000 UTC m=+1234.638865651" Mar 18 14:20:33 crc kubenswrapper[4756]: I0318 14:20:33.341999 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-r6s8c" podStartSLOduration=9.913679597 podStartE2EDuration="40.341983259s" podCreationTimestamp="2026-03-18 14:19:53 +0000 UTC" firstStartedPulling="2026-03-18 14:20:01.652192488 +0000 UTC m=+1202.966610463" lastFinishedPulling="2026-03-18 14:20:32.08049615 +0000 UTC m=+1233.394914125" observedRunningTime="2026-03-18 14:20:33.337231242 +0000 UTC m=+1234.651649207" watchObservedRunningTime="2026-03-18 14:20:33.341983259 +0000 UTC m=+1234.656401234" Mar 18 14:20:33 crc kubenswrapper[4756]: I0318 14:20:33.538148 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 14:20:33 crc kubenswrapper[4756]: I0318 14:20:33.620490 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 14:20:34 crc kubenswrapper[4756]: I0318 14:20:34.032889 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-r6s8c" Mar 18 14:20:34 crc kubenswrapper[4756]: I0318 14:20:34.316869 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-vsj9n" event={"ID":"3f5c347c-244f-40b6-8311-8eac0e22626a","Type":"ContainerStarted","Data":"94eb82b5ddc151d46e8b25b6f662379017184ab8223e98db7431e84513c4589c"} Mar 18 14:20:34 crc kubenswrapper[4756]: I0318 14:20:34.936906 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:34 crc kubenswrapper[4756]: E0318 14:20:34.937138 4756 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 14:20:34 crc kubenswrapper[4756]: E0318 14:20:34.937260 4756 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 14:20:34 crc kubenswrapper[4756]: E0318 14:20:34.937313 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift podName:717e4d16-f5d1-4367-ad0e-baf820923225 nodeName:}" failed. No retries permitted until 2026-03-18 14:20:38.937298576 +0000 UTC m=+1240.251716541 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift") pod "swift-storage-0" (UID: "717e4d16-f5d1-4367-ad0e-baf820923225") : configmap "swift-ring-files" not found Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.012201 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-w2xct"] Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.016185 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-w2xct" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.022720 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.023992 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-w2xct"] Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.034093 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-qfz9h"] Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.042464 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qfz9h"] Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.042566 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.048194 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.048382 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.048497 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.069241 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-qfz9h"] Mar 18 14:20:35 crc kubenswrapper[4756]: E0318 14:20:35.069864 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-k6bgz ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-k6bgz ring-data-devices scripts swiftconf]: context 
canceled" pod="openstack/swift-ring-rebalance-qfz9h" podUID="c4a865bb-c4f8-4821-b57a-11ecfa8b3f16" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.087749 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2crm4"] Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.089217 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140583 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-combined-ca-bundle\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140647 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-scripts\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140666 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9-operator-scripts\") pod \"root-account-create-update-w2xct\" (UID: \"9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9\") " pod="openstack/root-account-create-update-w2xct" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140708 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-swiftconf\") pod \"swift-ring-rebalance-qfz9h\" (UID: 
\"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140733 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbbf2\" (UniqueName: \"kubernetes.io/projected/00bfff2e-d59e-4936-b0e1-3476f2d01242-kube-api-access-nbbf2\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140749 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-dispersionconf\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140766 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-etc-swift\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140792 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-combined-ca-bundle\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140809 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/00bfff2e-d59e-4936-b0e1-3476f2d01242-etc-swift\") pod 
\"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140835 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6bgz\" (UniqueName: \"kubernetes.io/projected/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-kube-api-access-k6bgz\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140856 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-dispersionconf\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140870 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-swiftconf\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140892 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00bfff2e-d59e-4936-b0e1-3476f2d01242-scripts\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140913 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/00bfff2e-d59e-4936-b0e1-3476f2d01242-ring-data-devices\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140944 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xzrt\" (UniqueName: \"kubernetes.io/projected/9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9-kube-api-access-5xzrt\") pod \"root-account-create-update-w2xct\" (UID: \"9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9\") " pod="openstack/root-account-create-update-w2xct" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.140960 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-ring-data-devices\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.141102 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2crm4"] Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.242549 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-combined-ca-bundle\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.242643 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-scripts\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc 
kubenswrapper[4756]: I0318 14:20:35.242669 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9-operator-scripts\") pod \"root-account-create-update-w2xct\" (UID: \"9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9\") " pod="openstack/root-account-create-update-w2xct" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.242727 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-swiftconf\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.242767 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbbf2\" (UniqueName: \"kubernetes.io/projected/00bfff2e-d59e-4936-b0e1-3476f2d01242-kube-api-access-nbbf2\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.242789 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-dispersionconf\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.242814 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-etc-swift\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.242853 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-combined-ca-bundle\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.242877 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/00bfff2e-d59e-4936-b0e1-3476f2d01242-etc-swift\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.242919 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6bgz\" (UniqueName: \"kubernetes.io/projected/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-kube-api-access-k6bgz\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.242952 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-dispersionconf\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.242971 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-swiftconf\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.243002 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/00bfff2e-d59e-4936-b0e1-3476f2d01242-scripts\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.243031 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/00bfff2e-d59e-4936-b0e1-3476f2d01242-ring-data-devices\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.243076 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xzrt\" (UniqueName: \"kubernetes.io/projected/9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9-kube-api-access-5xzrt\") pod \"root-account-create-update-w2xct\" (UID: \"9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9\") " pod="openstack/root-account-create-update-w2xct" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.243099 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-ring-data-devices\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.243420 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9-operator-scripts\") pod \"root-account-create-update-w2xct\" (UID: \"9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9\") " pod="openstack/root-account-create-update-w2xct" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.243741 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-scripts\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.243837 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-ring-data-devices\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.243988 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00bfff2e-d59e-4936-b0e1-3476f2d01242-scripts\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.244637 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/00bfff2e-d59e-4936-b0e1-3476f2d01242-ring-data-devices\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.244675 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-etc-swift\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.244758 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/00bfff2e-d59e-4936-b0e1-3476f2d01242-etc-swift\") pod \"swift-ring-rebalance-2crm4\" (UID: 
\"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.248756 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-dispersionconf\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.248820 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-swiftconf\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.249477 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-combined-ca-bundle\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.249766 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-dispersionconf\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.257208 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-swiftconf\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 
14:20:35.259253 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-combined-ca-bundle\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.260170 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xzrt\" (UniqueName: \"kubernetes.io/projected/9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9-kube-api-access-5xzrt\") pod \"root-account-create-update-w2xct\" (UID: \"9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9\") " pod="openstack/root-account-create-update-w2xct" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.260432 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbbf2\" (UniqueName: \"kubernetes.io/projected/00bfff2e-d59e-4936-b0e1-3476f2d01242-kube-api-access-nbbf2\") pod \"swift-ring-rebalance-2crm4\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.260779 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6bgz\" (UniqueName: \"kubernetes.io/projected/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-kube-api-access-k6bgz\") pod \"swift-ring-rebalance-qfz9h\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.326509 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.342018 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.348684 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-w2xct" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.421052 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.447466 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-swiftconf\") pod \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.447551 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-etc-swift\") pod \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.447604 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-dispersionconf\") pod \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.447658 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-ring-data-devices\") pod \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.447784 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-combined-ca-bundle\") pod \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\" (UID: 
\"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.447873 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-scripts\") pod \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.447916 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6bgz\" (UniqueName: \"kubernetes.io/projected/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-kube-api-access-k6bgz\") pod \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\" (UID: \"c4a865bb-c4f8-4821-b57a-11ecfa8b3f16\") " Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.447994 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c4a865bb-c4f8-4821-b57a-11ecfa8b3f16" (UID: "c4a865bb-c4f8-4821-b57a-11ecfa8b3f16"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.448409 4756 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.448827 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c4a865bb-c4f8-4821-b57a-11ecfa8b3f16" (UID: "c4a865bb-c4f8-4821-b57a-11ecfa8b3f16"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.448952 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-scripts" (OuterVolumeSpecName: "scripts") pod "c4a865bb-c4f8-4821-b57a-11ecfa8b3f16" (UID: "c4a865bb-c4f8-4821-b57a-11ecfa8b3f16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.456859 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c4a865bb-c4f8-4821-b57a-11ecfa8b3f16" (UID: "c4a865bb-c4f8-4821-b57a-11ecfa8b3f16"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.456919 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-kube-api-access-k6bgz" (OuterVolumeSpecName: "kube-api-access-k6bgz") pod "c4a865bb-c4f8-4821-b57a-11ecfa8b3f16" (UID: "c4a865bb-c4f8-4821-b57a-11ecfa8b3f16"). InnerVolumeSpecName "kube-api-access-k6bgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.456970 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4a865bb-c4f8-4821-b57a-11ecfa8b3f16" (UID: "c4a865bb-c4f8-4821-b57a-11ecfa8b3f16"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.457215 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c4a865bb-c4f8-4821-b57a-11ecfa8b3f16" (UID: "c4a865bb-c4f8-4821-b57a-11ecfa8b3f16"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.549916 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.550282 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.550295 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6bgz\" (UniqueName: \"kubernetes.io/projected/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-kube-api-access-k6bgz\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.550307 4756 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.550319 4756 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:35 crc kubenswrapper[4756]: I0318 14:20:35.550330 4756 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:36 crc kubenswrapper[4756]: I0318 14:20:36.256637 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2crm4"] Mar 18 14:20:36 crc kubenswrapper[4756]: W0318 14:20:36.264301 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00bfff2e_d59e_4936_b0e1_3476f2d01242.slice/crio-7101c0a5e1051c878055df5e5be754a1b7f83f5ae2dddd5159e8c23e892c7d41 WatchSource:0}: Error finding container 7101c0a5e1051c878055df5e5be754a1b7f83f5ae2dddd5159e8c23e892c7d41: Status 404 returned error can't find the container with id 7101c0a5e1051c878055df5e5be754a1b7f83f5ae2dddd5159e8c23e892c7d41 Mar 18 14:20:36 crc kubenswrapper[4756]: I0318 14:20:36.356215 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vsj9n" event={"ID":"3f5c347c-244f-40b6-8311-8eac0e22626a","Type":"ContainerStarted","Data":"7a427156ae6ad14ac7ab5f65ce48dd89512858251c156013e53f6ddb2b90852f"} Mar 18 14:20:36 crc kubenswrapper[4756]: I0318 14:20:36.356269 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:20:36 crc kubenswrapper[4756]: I0318 14:20:36.356287 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:20:36 crc kubenswrapper[4756]: I0318 14:20:36.357178 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-w2xct"] Mar 18 14:20:36 crc kubenswrapper[4756]: I0318 14:20:36.358143 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2crm4" event={"ID":"00bfff2e-d59e-4936-b0e1-3476f2d01242","Type":"ContainerStarted","Data":"7101c0a5e1051c878055df5e5be754a1b7f83f5ae2dddd5159e8c23e892c7d41"} Mar 18 14:20:36 crc kubenswrapper[4756]: 
W0318 14:20:36.358473 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ef1b7d9_ff12_4e18_9fb8_beaca8092ba9.slice/crio-81b0e3f081ce70110b645a563d69ff1026c5a3c5c0f49538cf9c96a54512d488 WatchSource:0}: Error finding container 81b0e3f081ce70110b645a563d69ff1026c5a3c5c0f49538cf9c96a54512d488: Status 404 returned error can't find the container with id 81b0e3f081ce70110b645a563d69ff1026c5a3c5c0f49538cf9c96a54512d488 Mar 18 14:20:36 crc kubenswrapper[4756]: I0318 14:20:36.361257 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d691251-328b-4c09-98d1-4b968ab5bc05","Type":"ContainerStarted","Data":"fca1a509e14bcad5b9c44607728d8f04e898011fd7a5677443fc809dbc712992"} Mar 18 14:20:36 crc kubenswrapper[4756]: I0318 14:20:36.368891 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qfz9h" Mar 18 14:20:36 crc kubenswrapper[4756]: I0318 14:20:36.369014 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"67dc4771-f106-4f33-9f84-0d7251e4259d","Type":"ContainerStarted","Data":"5f570f3352803d95778137ed7834474549a34afc0b73569b324174ff4f99f7b5"} Mar 18 14:20:36 crc kubenswrapper[4756]: I0318 14:20:36.369314 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 14:20:36 crc kubenswrapper[4756]: I0318 14:20:36.385489 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-vsj9n" podStartSLOduration=14.011031496 podStartE2EDuration="43.385459613s" podCreationTimestamp="2026-03-18 14:19:53 +0000 UTC" firstStartedPulling="2026-03-18 14:20:02.422767823 +0000 UTC m=+1203.737185808" lastFinishedPulling="2026-03-18 14:20:31.79719595 +0000 UTC m=+1233.111613925" observedRunningTime="2026-03-18 14:20:36.37834201 +0000 UTC 
m=+1237.692759985" watchObservedRunningTime="2026-03-18 14:20:36.385459613 +0000 UTC m=+1237.699877588" Mar 18 14:20:36 crc kubenswrapper[4756]: I0318 14:20:36.405365 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.744152126 podStartE2EDuration="47.40534327s" podCreationTimestamp="2026-03-18 14:19:49 +0000 UTC" firstStartedPulling="2026-03-18 14:20:00.235510493 +0000 UTC m=+1201.549928468" lastFinishedPulling="2026-03-18 14:20:35.896701637 +0000 UTC m=+1237.211119612" observedRunningTime="2026-03-18 14:20:36.400699234 +0000 UTC m=+1237.715117229" watchObservedRunningTime="2026-03-18 14:20:36.40534327 +0000 UTC m=+1237.719761255" Mar 18 14:20:36 crc kubenswrapper[4756]: I0318 14:20:36.452615 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-qfz9h"] Mar 18 14:20:36 crc kubenswrapper[4756]: I0318 14:20:36.468135 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-qfz9h"] Mar 18 14:20:37 crc kubenswrapper[4756]: I0318 14:20:37.327760 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4a865bb-c4f8-4821-b57a-11ecfa8b3f16" path="/var/lib/kubelet/pods/c4a865bb-c4f8-4821-b57a-11ecfa8b3f16/volumes" Mar 18 14:20:37 crc kubenswrapper[4756]: I0318 14:20:37.390334 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-w2xct" event={"ID":"9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9","Type":"ContainerStarted","Data":"81b0e3f081ce70110b645a563d69ff1026c5a3c5c0f49538cf9c96a54512d488"} Mar 18 14:20:37 crc kubenswrapper[4756]: I0318 14:20:37.696145 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd" Mar 18 14:20:37 crc kubenswrapper[4756]: I0318 14:20:37.934277 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-pkbmp"] Mar 18 14:20:37 crc 
kubenswrapper[4756]: I0318 14:20:37.935749 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pkbmp" Mar 18 14:20:37 crc kubenswrapper[4756]: I0318 14:20:37.953533 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pkbmp"] Mar 18 14:20:37 crc kubenswrapper[4756]: I0318 14:20:37.984632 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-sv5g4" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.004067 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmmpq\" (UniqueName: \"kubernetes.io/projected/fb9485ce-3028-491d-8149-55e39b14c5a5-kube-api-access-wmmpq\") pod \"glance-db-create-pkbmp\" (UID: \"fb9485ce-3028-491d-8149-55e39b14c5a5\") " pod="openstack/glance-db-create-pkbmp" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.004189 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9485ce-3028-491d-8149-55e39b14c5a5-operator-scripts\") pod \"glance-db-create-pkbmp\" (UID: \"fb9485ce-3028-491d-8149-55e39b14c5a5\") " pod="openstack/glance-db-create-pkbmp" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.030462 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-4pgf5" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.064274 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-dc6b-account-create-update-9pd9b"] Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.066479 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-dc6b-account-create-update-9pd9b" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.076389 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.090773 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-dc6b-account-create-update-9pd9b"] Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.106554 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmmpq\" (UniqueName: \"kubernetes.io/projected/fb9485ce-3028-491d-8149-55e39b14c5a5-kube-api-access-wmmpq\") pod \"glance-db-create-pkbmp\" (UID: \"fb9485ce-3028-491d-8149-55e39b14c5a5\") " pod="openstack/glance-db-create-pkbmp" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.106640 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9485ce-3028-491d-8149-55e39b14c5a5-operator-scripts\") pod \"glance-db-create-pkbmp\" (UID: \"fb9485ce-3028-491d-8149-55e39b14c5a5\") " pod="openstack/glance-db-create-pkbmp" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.107560 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9485ce-3028-491d-8149-55e39b14c5a5-operator-scripts\") pod \"glance-db-create-pkbmp\" (UID: \"fb9485ce-3028-491d-8149-55e39b14c5a5\") " pod="openstack/glance-db-create-pkbmp" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.208919 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmmpq\" (UniqueName: \"kubernetes.io/projected/fb9485ce-3028-491d-8149-55e39b14c5a5-kube-api-access-wmmpq\") pod \"glance-db-create-pkbmp\" (UID: \"fb9485ce-3028-491d-8149-55e39b14c5a5\") " pod="openstack/glance-db-create-pkbmp" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 
14:20:38.210273 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl75n\" (UniqueName: \"kubernetes.io/projected/4ca56e1e-6fbd-4f49-8abf-3f3610879132-kube-api-access-nl75n\") pod \"glance-dc6b-account-create-update-9pd9b\" (UID: \"4ca56e1e-6fbd-4f49-8abf-3f3610879132\") " pod="openstack/glance-dc6b-account-create-update-9pd9b" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.210379 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ca56e1e-6fbd-4f49-8abf-3f3610879132-operator-scripts\") pod \"glance-dc6b-account-create-update-9pd9b\" (UID: \"4ca56e1e-6fbd-4f49-8abf-3f3610879132\") " pod="openstack/glance-dc6b-account-create-update-9pd9b" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.261841 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pkbmp" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.311657 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl75n\" (UniqueName: \"kubernetes.io/projected/4ca56e1e-6fbd-4f49-8abf-3f3610879132-kube-api-access-nl75n\") pod \"glance-dc6b-account-create-update-9pd9b\" (UID: \"4ca56e1e-6fbd-4f49-8abf-3f3610879132\") " pod="openstack/glance-dc6b-account-create-update-9pd9b" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.311744 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ca56e1e-6fbd-4f49-8abf-3f3610879132-operator-scripts\") pod \"glance-dc6b-account-create-update-9pd9b\" (UID: \"4ca56e1e-6fbd-4f49-8abf-3f3610879132\") " pod="openstack/glance-dc6b-account-create-update-9pd9b" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.312956 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ca56e1e-6fbd-4f49-8abf-3f3610879132-operator-scripts\") pod \"glance-dc6b-account-create-update-9pd9b\" (UID: \"4ca56e1e-6fbd-4f49-8abf-3f3610879132\") " pod="openstack/glance-dc6b-account-create-update-9pd9b" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.329603 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl75n\" (UniqueName: \"kubernetes.io/projected/4ca56e1e-6fbd-4f49-8abf-3f3610879132-kube-api-access-nl75n\") pod \"glance-dc6b-account-create-update-9pd9b\" (UID: \"4ca56e1e-6fbd-4f49-8abf-3f3610879132\") " pod="openstack/glance-dc6b-account-create-update-9pd9b" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.388170 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-dc6b-account-create-update-9pd9b" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.407831 4756 generic.go:334] "Generic (PLEG): container finished" podID="9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9" containerID="8b46fe93853f47e87b4dc6bad8803682f8e507f94a42ad20dfe1f0850bf6adf1" exitCode=0 Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.407896 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-w2xct" event={"ID":"9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9","Type":"ContainerDied","Data":"8b46fe93853f47e87b4dc6bad8803682f8e507f94a42ad20dfe1f0850bf6adf1"} Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.764789 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-j7cgx"] Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.766235 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-j7cgx" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.773133 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-j7cgx"] Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.821694 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f9f2f47-c757-4ed2-8cf0-a89921d6e48b-operator-scripts\") pod \"keystone-db-create-j7cgx\" (UID: \"7f9f2f47-c757-4ed2-8cf0-a89921d6e48b\") " pod="openstack/keystone-db-create-j7cgx" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.821870 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjp4s\" (UniqueName: \"kubernetes.io/projected/7f9f2f47-c757-4ed2-8cf0-a89921d6e48b-kube-api-access-tjp4s\") pod \"keystone-db-create-j7cgx\" (UID: \"7f9f2f47-c757-4ed2-8cf0-a89921d6e48b\") " pod="openstack/keystone-db-create-j7cgx" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.841669 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="ca8fb9b6-a1a9-4781-af8f-2e7e78e62771" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.869130 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-11e9-account-create-update-zj46z"] Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.873007 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-11e9-account-create-update-zj46z" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.874788 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.878549 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-11e9-account-create-update-zj46z"] Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.924029 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f9f2f47-c757-4ed2-8cf0-a89921d6e48b-operator-scripts\") pod \"keystone-db-create-j7cgx\" (UID: \"7f9f2f47-c757-4ed2-8cf0-a89921d6e48b\") " pod="openstack/keystone-db-create-j7cgx" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.924794 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f9f2f47-c757-4ed2-8cf0-a89921d6e48b-operator-scripts\") pod \"keystone-db-create-j7cgx\" (UID: \"7f9f2f47-c757-4ed2-8cf0-a89921d6e48b\") " pod="openstack/keystone-db-create-j7cgx" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.924918 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjp4s\" (UniqueName: \"kubernetes.io/projected/7f9f2f47-c757-4ed2-8cf0-a89921d6e48b-kube-api-access-tjp4s\") pod \"keystone-db-create-j7cgx\" (UID: \"7f9f2f47-c757-4ed2-8cf0-a89921d6e48b\") " pod="openstack/keystone-db-create-j7cgx" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.927237 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjhwl\" (UniqueName: \"kubernetes.io/projected/f9d20605-425f-4245-8f7a-7d0c95e24e25-kube-api-access-cjhwl\") pod \"keystone-11e9-account-create-update-zj46z\" (UID: \"f9d20605-425f-4245-8f7a-7d0c95e24e25\") " 
pod="openstack/keystone-11e9-account-create-update-zj46z" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.927351 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9d20605-425f-4245-8f7a-7d0c95e24e25-operator-scripts\") pod \"keystone-11e9-account-create-update-zj46z\" (UID: \"f9d20605-425f-4245-8f7a-7d0c95e24e25\") " pod="openstack/keystone-11e9-account-create-update-zj46z" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.948676 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjp4s\" (UniqueName: \"kubernetes.io/projected/7f9f2f47-c757-4ed2-8cf0-a89921d6e48b-kube-api-access-tjp4s\") pod \"keystone-db-create-j7cgx\" (UID: \"7f9f2f47-c757-4ed2-8cf0-a89921d6e48b\") " pod="openstack/keystone-db-create-j7cgx" Mar 18 14:20:38 crc kubenswrapper[4756]: I0318 14:20:38.982348 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.029487 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.030475 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjhwl\" (UniqueName: \"kubernetes.io/projected/f9d20605-425f-4245-8f7a-7d0c95e24e25-kube-api-access-cjhwl\") pod \"keystone-11e9-account-create-update-zj46z\" (UID: \"f9d20605-425f-4245-8f7a-7d0c95e24e25\") " pod="openstack/keystone-11e9-account-create-update-zj46z" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.030529 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9d20605-425f-4245-8f7a-7d0c95e24e25-operator-scripts\") pod \"keystone-11e9-account-create-update-zj46z\" (UID: \"f9d20605-425f-4245-8f7a-7d0c95e24e25\") " pod="openstack/keystone-11e9-account-create-update-zj46z" Mar 18 14:20:39 crc kubenswrapper[4756]: E0318 14:20:39.029896 4756 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 14:20:39 crc kubenswrapper[4756]: E0318 14:20:39.031369 4756 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.031421 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9d20605-425f-4245-8f7a-7d0c95e24e25-operator-scripts\") pod \"keystone-11e9-account-create-update-zj46z\" (UID: \"f9d20605-425f-4245-8f7a-7d0c95e24e25\") " pod="openstack/keystone-11e9-account-create-update-zj46z" Mar 18 14:20:39 crc kubenswrapper[4756]: E0318 14:20:39.031436 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift podName:717e4d16-f5d1-4367-ad0e-baf820923225 nodeName:}" failed. No retries permitted until 2026-03-18 14:20:47.031411606 +0000 UTC m=+1248.345829641 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift") pod "swift-storage-0" (UID: "717e4d16-f5d1-4367-ad0e-baf820923225") : configmap "swift-ring-files" not found Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.044875 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-j7cgx" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.057544 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjhwl\" (UniqueName: \"kubernetes.io/projected/f9d20605-425f-4245-8f7a-7d0c95e24e25-kube-api-access-cjhwl\") pod \"keystone-11e9-account-create-update-zj46z\" (UID: \"f9d20605-425f-4245-8f7a-7d0c95e24e25\") " pod="openstack/keystone-11e9-account-create-update-zj46z" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.075232 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6bjq5"] Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.081932 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-11e9-account-create-update-zj46z" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.084724 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6bjq5" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.121170 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d1e7-account-create-update-6b52m"] Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.122522 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d1e7-account-create-update-6b52m" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.124888 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.141390 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6bjq5"] Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.158193 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d1e7-account-create-update-6b52m"] Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.218363 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.236762 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsl4g\" (UniqueName: \"kubernetes.io/projected/7b8c1050-ec08-4c1b-80a0-09461e459598-kube-api-access-xsl4g\") pod \"placement-db-create-6bjq5\" (UID: \"7b8c1050-ec08-4c1b-80a0-09461e459598\") " pod="openstack/placement-db-create-6bjq5" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.236985 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b8c1050-ec08-4c1b-80a0-09461e459598-operator-scripts\") pod \"placement-db-create-6bjq5\" (UID: \"7b8c1050-ec08-4c1b-80a0-09461e459598\") " pod="openstack/placement-db-create-6bjq5" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.237056 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5c3691b-930b-4cfe-b3c4-b5a5a6244e28-operator-scripts\") pod \"placement-d1e7-account-create-update-6b52m\" (UID: \"c5c3691b-930b-4cfe-b3c4-b5a5a6244e28\") " 
pod="openstack/placement-d1e7-account-create-update-6b52m" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.237097 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq7cl\" (UniqueName: \"kubernetes.io/projected/c5c3691b-930b-4cfe-b3c4-b5a5a6244e28-kube-api-access-kq7cl\") pod \"placement-d1e7-account-create-update-6b52m\" (UID: \"c5c3691b-930b-4cfe-b3c4-b5a5a6244e28\") " pod="openstack/placement-d1e7-account-create-update-6b52m" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.334477 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pkbmp"] Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.348315 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq7cl\" (UniqueName: \"kubernetes.io/projected/c5c3691b-930b-4cfe-b3c4-b5a5a6244e28-kube-api-access-kq7cl\") pod \"placement-d1e7-account-create-update-6b52m\" (UID: \"c5c3691b-930b-4cfe-b3c4-b5a5a6244e28\") " pod="openstack/placement-d1e7-account-create-update-6b52m" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.348627 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsl4g\" (UniqueName: \"kubernetes.io/projected/7b8c1050-ec08-4c1b-80a0-09461e459598-kube-api-access-xsl4g\") pod \"placement-db-create-6bjq5\" (UID: \"7b8c1050-ec08-4c1b-80a0-09461e459598\") " pod="openstack/placement-db-create-6bjq5" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.348785 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b8c1050-ec08-4c1b-80a0-09461e459598-operator-scripts\") pod \"placement-db-create-6bjq5\" (UID: \"7b8c1050-ec08-4c1b-80a0-09461e459598\") " pod="openstack/placement-db-create-6bjq5" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.348932 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5c3691b-930b-4cfe-b3c4-b5a5a6244e28-operator-scripts\") pod \"placement-d1e7-account-create-update-6b52m\" (UID: \"c5c3691b-930b-4cfe-b3c4-b5a5a6244e28\") " pod="openstack/placement-d1e7-account-create-update-6b52m" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.349701 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5c3691b-930b-4cfe-b3c4-b5a5a6244e28-operator-scripts\") pod \"placement-d1e7-account-create-update-6b52m\" (UID: \"c5c3691b-930b-4cfe-b3c4-b5a5a6244e28\") " pod="openstack/placement-d1e7-account-create-update-6b52m" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.349926 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b8c1050-ec08-4c1b-80a0-09461e459598-operator-scripts\") pod \"placement-db-create-6bjq5\" (UID: \"7b8c1050-ec08-4c1b-80a0-09461e459598\") " pod="openstack/placement-db-create-6bjq5" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.365732 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsl4g\" (UniqueName: \"kubernetes.io/projected/7b8c1050-ec08-4c1b-80a0-09461e459598-kube-api-access-xsl4g\") pod \"placement-db-create-6bjq5\" (UID: \"7b8c1050-ec08-4c1b-80a0-09461e459598\") " pod="openstack/placement-db-create-6bjq5" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.366757 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq7cl\" (UniqueName: \"kubernetes.io/projected/c5c3691b-930b-4cfe-b3c4-b5a5a6244e28-kube-api-access-kq7cl\") pod \"placement-d1e7-account-create-update-6b52m\" (UID: \"c5c3691b-930b-4cfe-b3c4-b5a5a6244e28\") " pod="openstack/placement-d1e7-account-create-update-6b52m" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.410192 4756 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6bjq5" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.429883 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-dc6b-account-create-update-9pd9b"] Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.435574 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d691251-328b-4c09-98d1-4b968ab5bc05","Type":"ContainerStarted","Data":"d81a6b6abc89111b1be452ced5400dfe826560faf02cb4c78d822740570d0ae0"} Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.443357 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pkbmp" event={"ID":"fb9485ce-3028-491d-8149-55e39b14c5a5","Type":"ContainerStarted","Data":"1fde882a73a235be6ccc6136249ecf9bcbb86e2f9fa65c40fd778bca78936990"} Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.448779 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"529ae791-8631-4ca5-9e4b-bb857d6264a8","Type":"ContainerStarted","Data":"a9d4358e82450b19fa6cecbdd95a77889c8e2f037f17dc17d5e90da32bd40161"} Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.451409 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-dc6b-account-create-update-9pd9b" event={"ID":"4ca56e1e-6fbd-4f49-8abf-3f3610879132","Type":"ContainerStarted","Data":"181473edd7f192cc017bd88055b62c7c1cdc7051a132bae2487c114270f5fde7"} Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.451781 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d1e7-account-create-update-6b52m" Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.655624 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-j7cgx"] Mar 18 14:20:39 crc kubenswrapper[4756]: I0318 14:20:39.727237 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-11e9-account-create-update-zj46z"] Mar 18 14:20:39 crc kubenswrapper[4756]: W0318 14:20:39.798775 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9d20605_425f_4245_8f7a_7d0c95e24e25.slice/crio-f1a5a26edbfc52db65c936f6aaaedd64776827699874e0c7d55a666a87b76979 WatchSource:0}: Error finding container f1a5a26edbfc52db65c936f6aaaedd64776827699874e0c7d55a666a87b76979: Status 404 returned error can't find the container with id f1a5a26edbfc52db65c936f6aaaedd64776827699874e0c7d55a666a87b76979 Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.090192 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-w2xct" Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.179229 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9-operator-scripts\") pod \"9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9\" (UID: \"9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9\") " Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.179390 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xzrt\" (UniqueName: \"kubernetes.io/projected/9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9-kube-api-access-5xzrt\") pod \"9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9\" (UID: \"9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9\") " Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.180322 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9" (UID: "9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.186243 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9-kube-api-access-5xzrt" (OuterVolumeSpecName: "kube-api-access-5xzrt") pod "9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9" (UID: "9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9"). InnerVolumeSpecName "kube-api-access-5xzrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.246152 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d1e7-account-create-update-6b52m"] Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.252859 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6bjq5"] Mar 18 14:20:40 crc kubenswrapper[4756]: W0318 14:20:40.269542 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b8c1050_ec08_4c1b_80a0_09461e459598.slice/crio-0452b24978669657b34c2affbf58d9a037f1bcecf98054bff6900a672ba2fb34 WatchSource:0}: Error finding container 0452b24978669657b34c2affbf58d9a037f1bcecf98054bff6900a672ba2fb34: Status 404 returned error can't find the container with id 0452b24978669657b34c2affbf58d9a037f1bcecf98054bff6900a672ba2fb34 Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.281007 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.281033 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xzrt\" (UniqueName: \"kubernetes.io/projected/9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9-kube-api-access-5xzrt\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.301786 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.360813 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-d59qb"] Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.361660 4756 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" podUID="7c25b083-0273-4f26-8fb7-ca7e78907b8c" containerName="dnsmasq-dns" containerID="cri-o://74fb192257f4a49010e4b73ce8424a4b1a99661af99e75c492cc27e35a2d77ee" gracePeriod=10 Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.469826 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d1e7-account-create-update-6b52m" event={"ID":"c5c3691b-930b-4cfe-b3c4-b5a5a6244e28","Type":"ContainerStarted","Data":"5147e394fc27a265499ef5650826a09644de83670ab24b644a5f787c223c2757"} Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.473388 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6bjq5" event={"ID":"7b8c1050-ec08-4c1b-80a0-09461e459598","Type":"ContainerStarted","Data":"0452b24978669657b34c2affbf58d9a037f1bcecf98054bff6900a672ba2fb34"} Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.476159 4756 generic.go:334] "Generic (PLEG): container finished" podID="7f9f2f47-c757-4ed2-8cf0-a89921d6e48b" containerID="3bd4f5b43753c02ab8a98e8a37967e6af8f6846631774c1152e9a280b7a35bd2" exitCode=0 Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.476228 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j7cgx" event={"ID":"7f9f2f47-c757-4ed2-8cf0-a89921d6e48b","Type":"ContainerDied","Data":"3bd4f5b43753c02ab8a98e8a37967e6af8f6846631774c1152e9a280b7a35bd2"} Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.476244 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j7cgx" event={"ID":"7f9f2f47-c757-4ed2-8cf0-a89921d6e48b","Type":"ContainerStarted","Data":"9e0b0872edba03dee43c1b45d95f03381efa837170fd2ef564d4439041533962"} Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.479490 4756 generic.go:334] "Generic (PLEG): container finished" podID="f9d20605-425f-4245-8f7a-7d0c95e24e25" containerID="d326a16ba2ccdc49f389d48265ccc4a917abf90377c058387addb24c132684bc" exitCode=0 Mar 18 
14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.479633 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-11e9-account-create-update-zj46z" event={"ID":"f9d20605-425f-4245-8f7a-7d0c95e24e25","Type":"ContainerDied","Data":"d326a16ba2ccdc49f389d48265ccc4a917abf90377c058387addb24c132684bc"} Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.479717 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-11e9-account-create-update-zj46z" event={"ID":"f9d20605-425f-4245-8f7a-7d0c95e24e25","Type":"ContainerStarted","Data":"f1a5a26edbfc52db65c936f6aaaedd64776827699874e0c7d55a666a87b76979"} Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.481643 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-w2xct" event={"ID":"9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9","Type":"ContainerDied","Data":"81b0e3f081ce70110b645a563d69ff1026c5a3c5c0f49538cf9c96a54512d488"} Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.481693 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81b0e3f081ce70110b645a563d69ff1026c5a3c5c0f49538cf9c96a54512d488" Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.481652 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-w2xct" Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.490489 4756 generic.go:334] "Generic (PLEG): container finished" podID="fb9485ce-3028-491d-8149-55e39b14c5a5" containerID="01ec4c5abb00ab9fa9100e70ad4ed31a2eeb7b94c0d9d5245285ad66338ffee0" exitCode=0 Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.490606 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pkbmp" event={"ID":"fb9485ce-3028-491d-8149-55e39b14c5a5","Type":"ContainerDied","Data":"01ec4c5abb00ab9fa9100e70ad4ed31a2eeb7b94c0d9d5245285ad66338ffee0"} Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.498022 4756 generic.go:334] "Generic (PLEG): container finished" podID="4ca56e1e-6fbd-4f49-8abf-3f3610879132" containerID="958cf6673af65aee881a8f448643d206ce25761d6e903669449e3138065f1244" exitCode=0 Mar 18 14:20:40 crc kubenswrapper[4756]: I0318 14:20:40.498077 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-dc6b-account-create-update-9pd9b" event={"ID":"4ca56e1e-6fbd-4f49-8abf-3f3610879132","Type":"ContainerDied","Data":"958cf6673af65aee881a8f448643d206ce25761d6e903669449e3138065f1244"} Mar 18 14:20:41 crc kubenswrapper[4756]: I0318 14:20:41.510713 4756 generic.go:334] "Generic (PLEG): container finished" podID="c5c3691b-930b-4cfe-b3c4-b5a5a6244e28" containerID="7d138f83b8a277738e896134149968b7272f36905cd61120046eab4a3b6d7244" exitCode=0 Mar 18 14:20:41 crc kubenswrapper[4756]: I0318 14:20:41.511431 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d1e7-account-create-update-6b52m" event={"ID":"c5c3691b-930b-4cfe-b3c4-b5a5a6244e28","Type":"ContainerDied","Data":"7d138f83b8a277738e896134149968b7272f36905cd61120046eab4a3b6d7244"} Mar 18 14:20:41 crc kubenswrapper[4756]: I0318 14:20:41.514439 4756 generic.go:334] "Generic (PLEG): container finished" podID="7b8c1050-ec08-4c1b-80a0-09461e459598" 
containerID="68c334552cdf9d26b95d33d9b9928951ca2afe018c7d8befac930d53be5506f4" exitCode=0 Mar 18 14:20:41 crc kubenswrapper[4756]: I0318 14:20:41.514517 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6bjq5" event={"ID":"7b8c1050-ec08-4c1b-80a0-09461e459598","Type":"ContainerDied","Data":"68c334552cdf9d26b95d33d9b9928951ca2afe018c7d8befac930d53be5506f4"} Mar 18 14:20:41 crc kubenswrapper[4756]: I0318 14:20:41.517248 4756 generic.go:334] "Generic (PLEG): container finished" podID="7c25b083-0273-4f26-8fb7-ca7e78907b8c" containerID="74fb192257f4a49010e4b73ce8424a4b1a99661af99e75c492cc27e35a2d77ee" exitCode=0 Mar 18 14:20:41 crc kubenswrapper[4756]: I0318 14:20:41.517373 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" event={"ID":"7c25b083-0273-4f26-8fb7-ca7e78907b8c","Type":"ContainerDied","Data":"74fb192257f4a49010e4b73ce8424a4b1a99661af99e75c492cc27e35a2d77ee"} Mar 18 14:20:43 crc kubenswrapper[4756]: I0318 14:20:43.212346 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pkbmp" Mar 18 14:20:43 crc kubenswrapper[4756]: I0318 14:20:43.353319 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmmpq\" (UniqueName: \"kubernetes.io/projected/fb9485ce-3028-491d-8149-55e39b14c5a5-kube-api-access-wmmpq\") pod \"fb9485ce-3028-491d-8149-55e39b14c5a5\" (UID: \"fb9485ce-3028-491d-8149-55e39b14c5a5\") " Mar 18 14:20:43 crc kubenswrapper[4756]: I0318 14:20:43.353450 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9485ce-3028-491d-8149-55e39b14c5a5-operator-scripts\") pod \"fb9485ce-3028-491d-8149-55e39b14c5a5\" (UID: \"fb9485ce-3028-491d-8149-55e39b14c5a5\") " Mar 18 14:20:43 crc kubenswrapper[4756]: I0318 14:20:43.354585 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb9485ce-3028-491d-8149-55e39b14c5a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb9485ce-3028-491d-8149-55e39b14c5a5" (UID: "fb9485ce-3028-491d-8149-55e39b14c5a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:43 crc kubenswrapper[4756]: I0318 14:20:43.364254 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb9485ce-3028-491d-8149-55e39b14c5a5-kube-api-access-wmmpq" (OuterVolumeSpecName: "kube-api-access-wmmpq") pod "fb9485ce-3028-491d-8149-55e39b14c5a5" (UID: "fb9485ce-3028-491d-8149-55e39b14c5a5"). InnerVolumeSpecName "kube-api-access-wmmpq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:43 crc kubenswrapper[4756]: I0318 14:20:43.461047 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmmpq\" (UniqueName: \"kubernetes.io/projected/fb9485ce-3028-491d-8149-55e39b14c5a5-kube-api-access-wmmpq\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:43 crc kubenswrapper[4756]: I0318 14:20:43.461106 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9485ce-3028-491d-8149-55e39b14c5a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:43 crc kubenswrapper[4756]: I0318 14:20:43.548353 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pkbmp" Mar 18 14:20:43 crc kubenswrapper[4756]: I0318 14:20:43.548982 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pkbmp" event={"ID":"fb9485ce-3028-491d-8149-55e39b14c5a5","Type":"ContainerDied","Data":"1fde882a73a235be6ccc6136249ecf9bcbb86e2f9fa65c40fd778bca78936990"} Mar 18 14:20:43 crc kubenswrapper[4756]: I0318 14:20:43.549024 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fde882a73a235be6ccc6136249ecf9bcbb86e2f9fa65c40fd778bca78936990" Mar 18 14:20:43 crc kubenswrapper[4756]: I0318 14:20:43.552641 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"529ae791-8631-4ca5-9e4b-bb857d6264a8","Type":"ContainerStarted","Data":"c9ecd53fd3da7584227fd7ad3025e149c5a08d4c7b03378ba1b72a01a0f0d4ec"} Mar 18 14:20:43 crc kubenswrapper[4756]: I0318 14:20:43.553165 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Mar 18 14:20:43 crc kubenswrapper[4756]: I0318 14:20:43.562522 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Mar 
18 14:20:43 crc kubenswrapper[4756]: I0318 14:20:43.575386 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=15.734588153 podStartE2EDuration="53.575368321s" podCreationTimestamp="2026-03-18 14:19:50 +0000 UTC" firstStartedPulling="2026-03-18 14:20:00.991939316 +0000 UTC m=+1202.306357291" lastFinishedPulling="2026-03-18 14:20:38.832719484 +0000 UTC m=+1240.147137459" observedRunningTime="2026-03-18 14:20:43.573786739 +0000 UTC m=+1244.888204734" watchObservedRunningTime="2026-03-18 14:20:43.575368321 +0000 UTC m=+1244.889786296" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.495083 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d1e7-account-create-update-6b52m" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.537774 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-11e9-account-create-update-zj46z" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.583886 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq7cl\" (UniqueName: \"kubernetes.io/projected/c5c3691b-930b-4cfe-b3c4-b5a5a6244e28-kube-api-access-kq7cl\") pod \"c5c3691b-930b-4cfe-b3c4-b5a5a6244e28\" (UID: \"c5c3691b-930b-4cfe-b3c4-b5a5a6244e28\") " Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.583962 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5c3691b-930b-4cfe-b3c4-b5a5a6244e28-operator-scripts\") pod \"c5c3691b-930b-4cfe-b3c4-b5a5a6244e28\" (UID: \"c5c3691b-930b-4cfe-b3c4-b5a5a6244e28\") " Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.584030 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f9d20605-425f-4245-8f7a-7d0c95e24e25-operator-scripts\") pod \"f9d20605-425f-4245-8f7a-7d0c95e24e25\" (UID: \"f9d20605-425f-4245-8f7a-7d0c95e24e25\") " Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.584195 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjhwl\" (UniqueName: \"kubernetes.io/projected/f9d20605-425f-4245-8f7a-7d0c95e24e25-kube-api-access-cjhwl\") pod \"f9d20605-425f-4245-8f7a-7d0c95e24e25\" (UID: \"f9d20605-425f-4245-8f7a-7d0c95e24e25\") " Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.584980 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5c3691b-930b-4cfe-b3c4-b5a5a6244e28-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5c3691b-930b-4cfe-b3c4-b5a5a6244e28" (UID: "c5c3691b-930b-4cfe-b3c4-b5a5a6244e28"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.585732 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9d20605-425f-4245-8f7a-7d0c95e24e25-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9d20605-425f-4245-8f7a-7d0c95e24e25" (UID: "f9d20605-425f-4245-8f7a-7d0c95e24e25"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.588511 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5c3691b-930b-4cfe-b3c4-b5a5a6244e28-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.588561 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9d20605-425f-4245-8f7a-7d0c95e24e25-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.597451 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-dc6b-account-create-update-9pd9b" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.598719 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6bjq5" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.598731 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-j7cgx" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.599325 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" event={"ID":"7c25b083-0273-4f26-8fb7-ca7e78907b8c","Type":"ContainerDied","Data":"4d0cf8e5e185af36f8a31d1ac75f63ea2c6c93bcb811b50655b03d1d80720560"} Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.599397 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d0cf8e5e185af36f8a31d1ac75f63ea2c6c93bcb811b50655b03d1d80720560" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.601819 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9d20605-425f-4245-8f7a-7d0c95e24e25-kube-api-access-cjhwl" (OuterVolumeSpecName: "kube-api-access-cjhwl") pod "f9d20605-425f-4245-8f7a-7d0c95e24e25" (UID: "f9d20605-425f-4245-8f7a-7d0c95e24e25"). InnerVolumeSpecName "kube-api-access-cjhwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.602202 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-dc6b-account-create-update-9pd9b" event={"ID":"4ca56e1e-6fbd-4f49-8abf-3f3610879132","Type":"ContainerDied","Data":"181473edd7f192cc017bd88055b62c7c1cdc7051a132bae2487c114270f5fde7"} Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.602317 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="181473edd7f192cc017bd88055b62c7c1cdc7051a132bae2487c114270f5fde7" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.602811 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5c3691b-930b-4cfe-b3c4-b5a5a6244e28-kube-api-access-kq7cl" (OuterVolumeSpecName: "kube-api-access-kq7cl") pod "c5c3691b-930b-4cfe-b3c4-b5a5a6244e28" (UID: "c5c3691b-930b-4cfe-b3c4-b5a5a6244e28"). 
InnerVolumeSpecName "kube-api-access-kq7cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.602945 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-dc6b-account-create-update-9pd9b" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.606787 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d1e7-account-create-update-6b52m" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.606791 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d1e7-account-create-update-6b52m" event={"ID":"c5c3691b-930b-4cfe-b3c4-b5a5a6244e28","Type":"ContainerDied","Data":"5147e394fc27a265499ef5650826a09644de83670ab24b644a5f787c223c2757"} Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.607060 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5147e394fc27a265499ef5650826a09644de83670ab24b644a5f787c223c2757" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.611645 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6bjq5" event={"ID":"7b8c1050-ec08-4c1b-80a0-09461e459598","Type":"ContainerDied","Data":"0452b24978669657b34c2affbf58d9a037f1bcecf98054bff6900a672ba2fb34"} Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.611700 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0452b24978669657b34c2affbf58d9a037f1bcecf98054bff6900a672ba2fb34" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.611769 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6bjq5" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.619172 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j7cgx" event={"ID":"7f9f2f47-c757-4ed2-8cf0-a89921d6e48b","Type":"ContainerDied","Data":"9e0b0872edba03dee43c1b45d95f03381efa837170fd2ef564d4439041533962"} Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.619206 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e0b0872edba03dee43c1b45d95f03381efa837170fd2ef564d4439041533962" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.619260 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-j7cgx" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.621951 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-11e9-account-create-update-zj46z" event={"ID":"f9d20605-425f-4245-8f7a-7d0c95e24e25","Type":"ContainerDied","Data":"f1a5a26edbfc52db65c936f6aaaedd64776827699874e0c7d55a666a87b76979"} Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.622026 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1a5a26edbfc52db65c936f6aaaedd64776827699874e0c7d55a666a87b76979" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.622177 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.622282 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-11e9-account-create-update-zj46z" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.689307 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl75n\" (UniqueName: \"kubernetes.io/projected/4ca56e1e-6fbd-4f49-8abf-3f3610879132-kube-api-access-nl75n\") pod \"4ca56e1e-6fbd-4f49-8abf-3f3610879132\" (UID: \"4ca56e1e-6fbd-4f49-8abf-3f3610879132\") " Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.689417 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c25b083-0273-4f26-8fb7-ca7e78907b8c-dns-svc\") pod \"7c25b083-0273-4f26-8fb7-ca7e78907b8c\" (UID: \"7c25b083-0273-4f26-8fb7-ca7e78907b8c\") " Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.689446 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f9f2f47-c757-4ed2-8cf0-a89921d6e48b-operator-scripts\") pod \"7f9f2f47-c757-4ed2-8cf0-a89921d6e48b\" (UID: \"7f9f2f47-c757-4ed2-8cf0-a89921d6e48b\") " Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.689487 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c25b083-0273-4f26-8fb7-ca7e78907b8c-config\") pod \"7c25b083-0273-4f26-8fb7-ca7e78907b8c\" (UID: \"7c25b083-0273-4f26-8fb7-ca7e78907b8c\") " Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.689526 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b8c1050-ec08-4c1b-80a0-09461e459598-operator-scripts\") pod \"7b8c1050-ec08-4c1b-80a0-09461e459598\" (UID: \"7b8c1050-ec08-4c1b-80a0-09461e459598\") " Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.689554 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-4lrqx\" (UniqueName: \"kubernetes.io/projected/7c25b083-0273-4f26-8fb7-ca7e78907b8c-kube-api-access-4lrqx\") pod \"7c25b083-0273-4f26-8fb7-ca7e78907b8c\" (UID: \"7c25b083-0273-4f26-8fb7-ca7e78907b8c\") " Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.689584 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsl4g\" (UniqueName: \"kubernetes.io/projected/7b8c1050-ec08-4c1b-80a0-09461e459598-kube-api-access-xsl4g\") pod \"7b8c1050-ec08-4c1b-80a0-09461e459598\" (UID: \"7b8c1050-ec08-4c1b-80a0-09461e459598\") " Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.689603 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjp4s\" (UniqueName: \"kubernetes.io/projected/7f9f2f47-c757-4ed2-8cf0-a89921d6e48b-kube-api-access-tjp4s\") pod \"7f9f2f47-c757-4ed2-8cf0-a89921d6e48b\" (UID: \"7f9f2f47-c757-4ed2-8cf0-a89921d6e48b\") " Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.689633 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ca56e1e-6fbd-4f49-8abf-3f3610879132-operator-scripts\") pod \"4ca56e1e-6fbd-4f49-8abf-3f3610879132\" (UID: \"4ca56e1e-6fbd-4f49-8abf-3f3610879132\") " Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.690067 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq7cl\" (UniqueName: \"kubernetes.io/projected/c5c3691b-930b-4cfe-b3c4-b5a5a6244e28-kube-api-access-kq7cl\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.690078 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjhwl\" (UniqueName: \"kubernetes.io/projected/f9d20605-425f-4245-8f7a-7d0c95e24e25-kube-api-access-cjhwl\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.690102 4756 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/7f9f2f47-c757-4ed2-8cf0-a89921d6e48b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f9f2f47-c757-4ed2-8cf0-a89921d6e48b" (UID: "7f9f2f47-c757-4ed2-8cf0-a89921d6e48b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.690755 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca56e1e-6fbd-4f49-8abf-3f3610879132-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4ca56e1e-6fbd-4f49-8abf-3f3610879132" (UID: "4ca56e1e-6fbd-4f49-8abf-3f3610879132"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.691393 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8c1050-ec08-4c1b-80a0-09461e459598-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b8c1050-ec08-4c1b-80a0-09461e459598" (UID: "7b8c1050-ec08-4c1b-80a0-09461e459598"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.693954 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca56e1e-6fbd-4f49-8abf-3f3610879132-kube-api-access-nl75n" (OuterVolumeSpecName: "kube-api-access-nl75n") pod "4ca56e1e-6fbd-4f49-8abf-3f3610879132" (UID: "4ca56e1e-6fbd-4f49-8abf-3f3610879132"). InnerVolumeSpecName "kube-api-access-nl75n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.695066 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b8c1050-ec08-4c1b-80a0-09461e459598-kube-api-access-xsl4g" (OuterVolumeSpecName: "kube-api-access-xsl4g") pod "7b8c1050-ec08-4c1b-80a0-09461e459598" (UID: "7b8c1050-ec08-4c1b-80a0-09461e459598"). InnerVolumeSpecName "kube-api-access-xsl4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.696252 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f9f2f47-c757-4ed2-8cf0-a89921d6e48b-kube-api-access-tjp4s" (OuterVolumeSpecName: "kube-api-access-tjp4s") pod "7f9f2f47-c757-4ed2-8cf0-a89921d6e48b" (UID: "7f9f2f47-c757-4ed2-8cf0-a89921d6e48b"). InnerVolumeSpecName "kube-api-access-tjp4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.698674 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c25b083-0273-4f26-8fb7-ca7e78907b8c-kube-api-access-4lrqx" (OuterVolumeSpecName: "kube-api-access-4lrqx") pod "7c25b083-0273-4f26-8fb7-ca7e78907b8c" (UID: "7c25b083-0273-4f26-8fb7-ca7e78907b8c"). InnerVolumeSpecName "kube-api-access-4lrqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.732728 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c25b083-0273-4f26-8fb7-ca7e78907b8c-config" (OuterVolumeSpecName: "config") pod "7c25b083-0273-4f26-8fb7-ca7e78907b8c" (UID: "7c25b083-0273-4f26-8fb7-ca7e78907b8c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.737595 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c25b083-0273-4f26-8fb7-ca7e78907b8c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7c25b083-0273-4f26-8fb7-ca7e78907b8c" (UID: "7c25b083-0273-4f26-8fb7-ca7e78907b8c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.791311 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl75n\" (UniqueName: \"kubernetes.io/projected/4ca56e1e-6fbd-4f49-8abf-3f3610879132-kube-api-access-nl75n\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.791344 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c25b083-0273-4f26-8fb7-ca7e78907b8c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.791354 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f9f2f47-c757-4ed2-8cf0-a89921d6e48b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.791362 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c25b083-0273-4f26-8fb7-ca7e78907b8c-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.791373 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b8c1050-ec08-4c1b-80a0-09461e459598-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.791384 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lrqx\" (UniqueName: 
\"kubernetes.io/projected/7c25b083-0273-4f26-8fb7-ca7e78907b8c-kube-api-access-4lrqx\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.791394 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsl4g\" (UniqueName: \"kubernetes.io/projected/7b8c1050-ec08-4c1b-80a0-09461e459598-kube-api-access-xsl4g\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.791402 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjp4s\" (UniqueName: \"kubernetes.io/projected/7f9f2f47-c757-4ed2-8cf0-a89921d6e48b-kube-api-access-tjp4s\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:44 crc kubenswrapper[4756]: I0318 14:20:44.791410 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ca56e1e-6fbd-4f49-8abf-3f3610879132-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:45 crc kubenswrapper[4756]: I0318 14:20:45.635532 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2crm4" event={"ID":"00bfff2e-d59e-4936-b0e1-3476f2d01242","Type":"ContainerStarted","Data":"3f689bfd6b6a03108086b2093cec06678f2399f6ae76eadc7090f9503428508e"} Mar 18 14:20:45 crc kubenswrapper[4756]: I0318 14:20:45.642903 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" Mar 18 14:20:45 crc kubenswrapper[4756]: I0318 14:20:45.643723 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d691251-328b-4c09-98d1-4b968ab5bc05","Type":"ContainerStarted","Data":"3034b4753cd3f1e18ef15ad501e7b2623f47aee37a1bb802c2840c6d9a89c495"} Mar 18 14:20:45 crc kubenswrapper[4756]: I0318 14:20:45.661899 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-2crm4" podStartSLOduration=2.6533001929999998 podStartE2EDuration="10.661879348s" podCreationTimestamp="2026-03-18 14:20:35 +0000 UTC" firstStartedPulling="2026-03-18 14:20:36.267402101 +0000 UTC m=+1237.581820076" lastFinishedPulling="2026-03-18 14:20:44.275981236 +0000 UTC m=+1245.590399231" observedRunningTime="2026-03-18 14:20:45.657143011 +0000 UTC m=+1246.971560996" watchObservedRunningTime="2026-03-18 14:20:45.661879348 +0000 UTC m=+1246.976297313" Mar 18 14:20:45 crc kubenswrapper[4756]: I0318 14:20:45.699589 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=12.405337333 podStartE2EDuration="55.699563228s" podCreationTimestamp="2026-03-18 14:19:50 +0000 UTC" firstStartedPulling="2026-03-18 14:20:00.993365774 +0000 UTC m=+1202.307783749" lastFinishedPulling="2026-03-18 14:20:44.287591629 +0000 UTC m=+1245.602009644" observedRunningTime="2026-03-18 14:20:45.692350142 +0000 UTC m=+1247.006768117" watchObservedRunningTime="2026-03-18 14:20:45.699563228 +0000 UTC m=+1247.013981213" Mar 18 14:20:45 crc kubenswrapper[4756]: I0318 14:20:45.711775 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-d59qb"] Mar 18 14:20:45 crc kubenswrapper[4756]: I0318 14:20:45.726380 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-d59qb"] Mar 18 14:20:46 crc 
kubenswrapper[4756]: I0318 14:20:46.135821 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-w2xct"] Mar 18 14:20:46 crc kubenswrapper[4756]: I0318 14:20:46.152267 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-w2xct"] Mar 18 14:20:46 crc kubenswrapper[4756]: I0318 14:20:46.392241 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 18 14:20:47 crc kubenswrapper[4756]: I0318 14:20:47.036461 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:20:47 crc kubenswrapper[4756]: E0318 14:20:47.036647 4756 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 14:20:47 crc kubenswrapper[4756]: E0318 14:20:47.036955 4756 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 14:20:47 crc kubenswrapper[4756]: E0318 14:20:47.037042 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift podName:717e4d16-f5d1-4367-ad0e-baf820923225 nodeName:}" failed. No retries permitted until 2026-03-18 14:21:03.037013861 +0000 UTC m=+1264.351431856 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift") pod "swift-storage-0" (UID: "717e4d16-f5d1-4367-ad0e-baf820923225") : configmap "swift-ring-files" not found Mar 18 14:20:47 crc kubenswrapper[4756]: I0318 14:20:47.331728 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c25b083-0273-4f26-8fb7-ca7e78907b8c" path="/var/lib/kubelet/pods/7c25b083-0273-4f26-8fb7-ca7e78907b8c/volumes" Mar 18 14:20:47 crc kubenswrapper[4756]: I0318 14:20:47.333692 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9" path="/var/lib/kubelet/pods/9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9/volumes" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.267459 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-g229b"] Mar 18 14:20:48 crc kubenswrapper[4756]: E0318 14:20:48.268283 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c25b083-0273-4f26-8fb7-ca7e78907b8c" containerName="dnsmasq-dns" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.268344 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c25b083-0273-4f26-8fb7-ca7e78907b8c" containerName="dnsmasq-dns" Mar 18 14:20:48 crc kubenswrapper[4756]: E0318 14:20:48.268409 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca56e1e-6fbd-4f49-8abf-3f3610879132" containerName="mariadb-account-create-update" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.268461 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca56e1e-6fbd-4f49-8abf-3f3610879132" containerName="mariadb-account-create-update" Mar 18 14:20:48 crc kubenswrapper[4756]: E0318 14:20:48.268518 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9f2f47-c757-4ed2-8cf0-a89921d6e48b" containerName="mariadb-database-create" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.268565 4756 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9f2f47-c757-4ed2-8cf0-a89921d6e48b" containerName="mariadb-database-create" Mar 18 14:20:48 crc kubenswrapper[4756]: E0318 14:20:48.268633 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c3691b-930b-4cfe-b3c4-b5a5a6244e28" containerName="mariadb-account-create-update" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.268684 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c3691b-930b-4cfe-b3c4-b5a5a6244e28" containerName="mariadb-account-create-update" Mar 18 14:20:48 crc kubenswrapper[4756]: E0318 14:20:48.268736 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9" containerName="mariadb-account-create-update" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.268783 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9" containerName="mariadb-account-create-update" Mar 18 14:20:48 crc kubenswrapper[4756]: E0318 14:20:48.268839 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9485ce-3028-491d-8149-55e39b14c5a5" containerName="mariadb-database-create" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.268886 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb9485ce-3028-491d-8149-55e39b14c5a5" containerName="mariadb-database-create" Mar 18 14:20:48 crc kubenswrapper[4756]: E0318 14:20:48.268939 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c25b083-0273-4f26-8fb7-ca7e78907b8c" containerName="init" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.268988 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c25b083-0273-4f26-8fb7-ca7e78907b8c" containerName="init" Mar 18 14:20:48 crc kubenswrapper[4756]: E0318 14:20:48.269037 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9d20605-425f-4245-8f7a-7d0c95e24e25" containerName="mariadb-account-create-update" Mar 18 
14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.269085 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d20605-425f-4245-8f7a-7d0c95e24e25" containerName="mariadb-account-create-update" Mar 18 14:20:48 crc kubenswrapper[4756]: E0318 14:20:48.269160 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8c1050-ec08-4c1b-80a0-09461e459598" containerName="mariadb-database-create" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.269210 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8c1050-ec08-4c1b-80a0-09461e459598" containerName="mariadb-database-create" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.269410 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c3691b-930b-4cfe-b3c4-b5a5a6244e28" containerName="mariadb-account-create-update" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.269485 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9d20605-425f-4245-8f7a-7d0c95e24e25" containerName="mariadb-account-create-update" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.269543 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f9f2f47-c757-4ed2-8cf0-a89921d6e48b" containerName="mariadb-database-create" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.269598 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ca56e1e-6fbd-4f49-8abf-3f3610879132" containerName="mariadb-account-create-update" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.269656 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c25b083-0273-4f26-8fb7-ca7e78907b8c" containerName="dnsmasq-dns" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.269713 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb9485ce-3028-491d-8149-55e39b14c5a5" containerName="mariadb-database-create" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.269780 4756 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="7b8c1050-ec08-4c1b-80a0-09461e459598" containerName="mariadb-database-create" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.269842 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef1b7d9-ff12-4e18-9fb8-beaca8092ba9" containerName="mariadb-account-create-update" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.270504 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-g229b" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.272701 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rwf89" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.274603 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.278391 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-g229b"] Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.368771 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-config-data\") pod \"glance-db-sync-g229b\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " pod="openstack/glance-db-sync-g229b" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.368906 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-combined-ca-bundle\") pod \"glance-db-sync-g229b\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " pod="openstack/glance-db-sync-g229b" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.368949 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjfwp\" (UniqueName: 
\"kubernetes.io/projected/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-kube-api-access-xjfwp\") pod \"glance-db-sync-g229b\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " pod="openstack/glance-db-sync-g229b" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.369002 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-db-sync-config-data\") pod \"glance-db-sync-g229b\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " pod="openstack/glance-db-sync-g229b" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.471048 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-config-data\") pod \"glance-db-sync-g229b\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " pod="openstack/glance-db-sync-g229b" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.471109 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-combined-ca-bundle\") pod \"glance-db-sync-g229b\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " pod="openstack/glance-db-sync-g229b" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.471157 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-db-sync-config-data\") pod \"glance-db-sync-g229b\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " pod="openstack/glance-db-sync-g229b" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.471178 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjfwp\" (UniqueName: \"kubernetes.io/projected/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-kube-api-access-xjfwp\") pod 
\"glance-db-sync-g229b\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " pod="openstack/glance-db-sync-g229b" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.483844 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-config-data\") pod \"glance-db-sync-g229b\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " pod="openstack/glance-db-sync-g229b" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.483874 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-db-sync-config-data\") pod \"glance-db-sync-g229b\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " pod="openstack/glance-db-sync-g229b" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.493951 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjfwp\" (UniqueName: \"kubernetes.io/projected/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-kube-api-access-xjfwp\") pod \"glance-db-sync-g229b\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " pod="openstack/glance-db-sync-g229b" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.494249 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-combined-ca-bundle\") pod \"glance-db-sync-g229b\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " pod="openstack/glance-db-sync-g229b" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.585413 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-g229b" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.797318 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-d59qb" podUID="7c25b083-0273-4f26-8fb7-ca7e78907b8c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.108:5353: i/o timeout" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.841965 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="ca8fb9b6-a1a9-4781-af8f-2e7e78e62771" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 14:20:48 crc kubenswrapper[4756]: I0318 14:20:48.898843 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 18 14:20:49 crc kubenswrapper[4756]: I0318 14:20:49.534295 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-g229b"] Mar 18 14:20:49 crc kubenswrapper[4756]: I0318 14:20:49.684523 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-g229b" event={"ID":"a6ee9241-87cd-43e5-90d2-869e14cc1eb6","Type":"ContainerStarted","Data":"6c1227b0ece1cca7a7acbd0ccd1734a9b7b15c618c99954f89ba014c89bfcf33"} Mar 18 14:20:49 crc kubenswrapper[4756]: I0318 14:20:49.951018 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.117581 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-7bbnt"] Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.119356 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7bbnt" Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.125013 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.128443 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7bbnt"] Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.221398 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wrwg\" (UniqueName: \"kubernetes.io/projected/5dc602be-eea4-4a22-bfba-1f6319b32064-kube-api-access-2wrwg\") pod \"root-account-create-update-7bbnt\" (UID: \"5dc602be-eea4-4a22-bfba-1f6319b32064\") " pod="openstack/root-account-create-update-7bbnt" Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.221661 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dc602be-eea4-4a22-bfba-1f6319b32064-operator-scripts\") pod \"root-account-create-update-7bbnt\" (UID: \"5dc602be-eea4-4a22-bfba-1f6319b32064\") " pod="openstack/root-account-create-update-7bbnt" Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.323424 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dc602be-eea4-4a22-bfba-1f6319b32064-operator-scripts\") pod \"root-account-create-update-7bbnt\" (UID: \"5dc602be-eea4-4a22-bfba-1f6319b32064\") " pod="openstack/root-account-create-update-7bbnt" Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.323529 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wrwg\" (UniqueName: \"kubernetes.io/projected/5dc602be-eea4-4a22-bfba-1f6319b32064-kube-api-access-2wrwg\") pod \"root-account-create-update-7bbnt\" (UID: 
\"5dc602be-eea4-4a22-bfba-1f6319b32064\") " pod="openstack/root-account-create-update-7bbnt" Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.324856 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dc602be-eea4-4a22-bfba-1f6319b32064-operator-scripts\") pod \"root-account-create-update-7bbnt\" (UID: \"5dc602be-eea4-4a22-bfba-1f6319b32064\") " pod="openstack/root-account-create-update-7bbnt" Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.346480 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wrwg\" (UniqueName: \"kubernetes.io/projected/5dc602be-eea4-4a22-bfba-1f6319b32064-kube-api-access-2wrwg\") pod \"root-account-create-update-7bbnt\" (UID: \"5dc602be-eea4-4a22-bfba-1f6319b32064\") " pod="openstack/root-account-create-update-7bbnt" Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.391685 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.397225 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.453686 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7bbnt" Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.705312 4756 generic.go:334] "Generic (PLEG): container finished" podID="00bfff2e-d59e-4936-b0e1-3476f2d01242" containerID="3f689bfd6b6a03108086b2093cec06678f2399f6ae76eadc7090f9503428508e" exitCode=0 Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.705445 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2crm4" event={"ID":"00bfff2e-d59e-4936-b0e1-3476f2d01242","Type":"ContainerDied","Data":"3f689bfd6b6a03108086b2093cec06678f2399f6ae76eadc7090f9503428508e"} Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.708507 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 18 14:20:51 crc kubenswrapper[4756]: I0318 14:20:51.931534 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7bbnt"] Mar 18 14:20:51 crc kubenswrapper[4756]: W0318 14:20:51.942793 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dc602be_eea4_4a22_bfba_1f6319b32064.slice/crio-3c8aaca80cbba282ef0cc2e1c4c048a33fb983709a22388ef6dc34181f3c2c5f WatchSource:0}: Error finding container 3c8aaca80cbba282ef0cc2e1c4c048a33fb983709a22388ef6dc34181f3c2c5f: Status 404 returned error can't find the container with id 3c8aaca80cbba282ef0cc2e1c4c048a33fb983709a22388ef6dc34181f3c2c5f Mar 18 14:20:52 crc kubenswrapper[4756]: I0318 14:20:52.717525 4756 generic.go:334] "Generic (PLEG): container finished" podID="5dc602be-eea4-4a22-bfba-1f6319b32064" containerID="33fe0a061152e5b59c3f8a7bbaa77565e496e5d2f5d13ee7143400ef8cacc2e7" exitCode=0 Mar 18 14:20:52 crc kubenswrapper[4756]: I0318 14:20:52.717652 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7bbnt" 
event={"ID":"5dc602be-eea4-4a22-bfba-1f6319b32064","Type":"ContainerDied","Data":"33fe0a061152e5b59c3f8a7bbaa77565e496e5d2f5d13ee7143400ef8cacc2e7"} Mar 18 14:20:52 crc kubenswrapper[4756]: I0318 14:20:52.717943 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7bbnt" event={"ID":"5dc602be-eea4-4a22-bfba-1f6319b32064","Type":"ContainerStarted","Data":"3c8aaca80cbba282ef0cc2e1c4c048a33fb983709a22388ef6dc34181f3c2c5f"} Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.074266 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.184815 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-dispersionconf\") pod \"00bfff2e-d59e-4936-b0e1-3476f2d01242\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.184900 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/00bfff2e-d59e-4936-b0e1-3476f2d01242-etc-swift\") pod \"00bfff2e-d59e-4936-b0e1-3476f2d01242\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.184947 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-swiftconf\") pod \"00bfff2e-d59e-4936-b0e1-3476f2d01242\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.184995 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-combined-ca-bundle\") pod 
\"00bfff2e-d59e-4936-b0e1-3476f2d01242\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.185047 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/00bfff2e-d59e-4936-b0e1-3476f2d01242-ring-data-devices\") pod \"00bfff2e-d59e-4936-b0e1-3476f2d01242\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.185093 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00bfff2e-d59e-4936-b0e1-3476f2d01242-scripts\") pod \"00bfff2e-d59e-4936-b0e1-3476f2d01242\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.185148 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbbf2\" (UniqueName: \"kubernetes.io/projected/00bfff2e-d59e-4936-b0e1-3476f2d01242-kube-api-access-nbbf2\") pod \"00bfff2e-d59e-4936-b0e1-3476f2d01242\" (UID: \"00bfff2e-d59e-4936-b0e1-3476f2d01242\") " Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.185653 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00bfff2e-d59e-4936-b0e1-3476f2d01242-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "00bfff2e-d59e-4936-b0e1-3476f2d01242" (UID: "00bfff2e-d59e-4936-b0e1-3476f2d01242"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.185877 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00bfff2e-d59e-4936-b0e1-3476f2d01242-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "00bfff2e-d59e-4936-b0e1-3476f2d01242" (UID: "00bfff2e-d59e-4936-b0e1-3476f2d01242"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.206846 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00bfff2e-d59e-4936-b0e1-3476f2d01242-kube-api-access-nbbf2" (OuterVolumeSpecName: "kube-api-access-nbbf2") pod "00bfff2e-d59e-4936-b0e1-3476f2d01242" (UID: "00bfff2e-d59e-4936-b0e1-3476f2d01242"). InnerVolumeSpecName "kube-api-access-nbbf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.207699 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00bfff2e-d59e-4936-b0e1-3476f2d01242-scripts" (OuterVolumeSpecName: "scripts") pod "00bfff2e-d59e-4936-b0e1-3476f2d01242" (UID: "00bfff2e-d59e-4936-b0e1-3476f2d01242"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.223527 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "00bfff2e-d59e-4936-b0e1-3476f2d01242" (UID: "00bfff2e-d59e-4936-b0e1-3476f2d01242"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.234392 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "00bfff2e-d59e-4936-b0e1-3476f2d01242" (UID: "00bfff2e-d59e-4936-b0e1-3476f2d01242"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.258584 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00bfff2e-d59e-4936-b0e1-3476f2d01242" (UID: "00bfff2e-d59e-4936-b0e1-3476f2d01242"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.288430 4756 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.288462 4756 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/00bfff2e-d59e-4936-b0e1-3476f2d01242-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.288474 4756 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.288483 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bfff2e-d59e-4936-b0e1-3476f2d01242-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.288492 4756 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/00bfff2e-d59e-4936-b0e1-3476f2d01242-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.288500 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/00bfff2e-d59e-4936-b0e1-3476f2d01242-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.288508 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbbf2\" (UniqueName: \"kubernetes.io/projected/00bfff2e-d59e-4936-b0e1-3476f2d01242-kube-api-access-nbbf2\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.728407 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2crm4" event={"ID":"00bfff2e-d59e-4936-b0e1-3476f2d01242","Type":"ContainerDied","Data":"7101c0a5e1051c878055df5e5be754a1b7f83f5ae2dddd5159e8c23e892c7d41"} Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.728466 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7101c0a5e1051c878055df5e5be754a1b7f83f5ae2dddd5159e8c23e892c7d41" Mar 18 14:20:53 crc kubenswrapper[4756]: I0318 14:20:53.728421 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2crm4" Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.094669 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7bbnt" Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.211817 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wrwg\" (UniqueName: \"kubernetes.io/projected/5dc602be-eea4-4a22-bfba-1f6319b32064-kube-api-access-2wrwg\") pod \"5dc602be-eea4-4a22-bfba-1f6319b32064\" (UID: \"5dc602be-eea4-4a22-bfba-1f6319b32064\") " Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.211938 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dc602be-eea4-4a22-bfba-1f6319b32064-operator-scripts\") pod \"5dc602be-eea4-4a22-bfba-1f6319b32064\" (UID: \"5dc602be-eea4-4a22-bfba-1f6319b32064\") " Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.214039 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dc602be-eea4-4a22-bfba-1f6319b32064-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5dc602be-eea4-4a22-bfba-1f6319b32064" (UID: "5dc602be-eea4-4a22-bfba-1f6319b32064"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.215305 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dc602be-eea4-4a22-bfba-1f6319b32064-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.222995 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dc602be-eea4-4a22-bfba-1f6319b32064-kube-api-access-2wrwg" (OuterVolumeSpecName: "kube-api-access-2wrwg") pod "5dc602be-eea4-4a22-bfba-1f6319b32064" (UID: "5dc602be-eea4-4a22-bfba-1f6319b32064"). InnerVolumeSpecName "kube-api-access-2wrwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.248803 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.249255 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerName="prometheus" containerID="cri-o://fca1a509e14bcad5b9c44607728d8f04e898011fd7a5677443fc809dbc712992" gracePeriod=600 Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.249382 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerName="thanos-sidecar" containerID="cri-o://3034b4753cd3f1e18ef15ad501e7b2623f47aee37a1bb802c2840c6d9a89c495" gracePeriod=600 Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.249418 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerName="config-reloader" containerID="cri-o://d81a6b6abc89111b1be452ced5400dfe826560faf02cb4c78d822740570d0ae0" gracePeriod=600 Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.317198 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wrwg\" (UniqueName: \"kubernetes.io/projected/5dc602be-eea4-4a22-bfba-1f6319b32064-kube-api-access-2wrwg\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.739537 4756 generic.go:334] "Generic (PLEG): container finished" podID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerID="3034b4753cd3f1e18ef15ad501e7b2623f47aee37a1bb802c2840c6d9a89c495" exitCode=0 Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.739578 4756 generic.go:334] "Generic (PLEG): container finished" 
podID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerID="d81a6b6abc89111b1be452ced5400dfe826560faf02cb4c78d822740570d0ae0" exitCode=0 Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.739588 4756 generic.go:334] "Generic (PLEG): container finished" podID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerID="fca1a509e14bcad5b9c44607728d8f04e898011fd7a5677443fc809dbc712992" exitCode=0 Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.740533 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d691251-328b-4c09-98d1-4b968ab5bc05","Type":"ContainerDied","Data":"3034b4753cd3f1e18ef15ad501e7b2623f47aee37a1bb802c2840c6d9a89c495"} Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.740799 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d691251-328b-4c09-98d1-4b968ab5bc05","Type":"ContainerDied","Data":"d81a6b6abc89111b1be452ced5400dfe826560faf02cb4c78d822740570d0ae0"} Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.740924 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d691251-328b-4c09-98d1-4b968ab5bc05","Type":"ContainerDied","Data":"fca1a509e14bcad5b9c44607728d8f04e898011fd7a5677443fc809dbc712992"} Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.741541 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7bbnt" event={"ID":"5dc602be-eea4-4a22-bfba-1f6319b32064","Type":"ContainerDied","Data":"3c8aaca80cbba282ef0cc2e1c4c048a33fb983709a22388ef6dc34181f3c2c5f"} Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.741710 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c8aaca80cbba282ef0cc2e1c4c048a33fb983709a22388ef6dc34181f3c2c5f" Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.741722 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7bbnt" Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.743586 4756 generic.go:334] "Generic (PLEG): container finished" podID="3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" containerID="e66e72f8966d8210856953e9f931b81d3562014f300152eb8832f6595cff013c" exitCode=0 Mar 18 14:20:54 crc kubenswrapper[4756]: I0318 14:20:54.743623 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce","Type":"ContainerDied","Data":"e66e72f8966d8210856953e9f931b81d3562014f300152eb8832f6595cff013c"} Mar 18 14:20:56 crc kubenswrapper[4756]: I0318 14:20:56.770937 4756 generic.go:334] "Generic (PLEG): container finished" podID="228ca85e-a493-4dc4-9b95-5148c92ba228" containerID="9e1cdfeacb5d741083531a9b44627ebef31f79c1b82b6e7e2e4feb1175a89cf9" exitCode=0 Mar 18 14:20:56 crc kubenswrapper[4756]: I0318 14:20:56.771282 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"228ca85e-a493-4dc4-9b95-5148c92ba228","Type":"ContainerDied","Data":"9e1cdfeacb5d741083531a9b44627ebef31f79c1b82b6e7e2e4feb1175a89cf9"} Mar 18 14:20:58 crc kubenswrapper[4756]: I0318 14:20:58.839985 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="ca8fb9b6-a1a9-4781-af8f-2e7e78e62771" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 14:20:59 crc kubenswrapper[4756]: I0318 14:20:59.395172 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.116:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.065797 4756 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.160697 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-2\") pod \"8d691251-328b-4c09-98d1-4b968ab5bc05\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.160751 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-0\") pod \"8d691251-328b-4c09-98d1-4b968ab5bc05\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.160808 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-thanos-prometheus-http-client-file\") pod \"8d691251-328b-4c09-98d1-4b968ab5bc05\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.160876 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-1\") pod \"8d691251-328b-4c09-98d1-4b968ab5bc05\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.160929 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzf7r\" (UniqueName: \"kubernetes.io/projected/8d691251-328b-4c09-98d1-4b968ab5bc05-kube-api-access-kzf7r\") pod \"8d691251-328b-4c09-98d1-4b968ab5bc05\" (UID: 
\"8d691251-328b-4c09-98d1-4b968ab5bc05\") " Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.161062 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdc3ac31-1c36-4851-94d7-354178e93595\") pod \"8d691251-328b-4c09-98d1-4b968ab5bc05\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.161098 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-web-config\") pod \"8d691251-328b-4c09-98d1-4b968ab5bc05\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.161156 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8d691251-328b-4c09-98d1-4b968ab5bc05-config-out\") pod \"8d691251-328b-4c09-98d1-4b968ab5bc05\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.161206 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-config\") pod \"8d691251-328b-4c09-98d1-4b968ab5bc05\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.161255 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8d691251-328b-4c09-98d1-4b968ab5bc05-tls-assets\") pod \"8d691251-328b-4c09-98d1-4b968ab5bc05\" (UID: \"8d691251-328b-4c09-98d1-4b968ab5bc05\") " Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.165104 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "8d691251-328b-4c09-98d1-4b968ab5bc05" (UID: "8d691251-328b-4c09-98d1-4b968ab5bc05"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.166448 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "8d691251-328b-4c09-98d1-4b968ab5bc05" (UID: "8d691251-328b-4c09-98d1-4b968ab5bc05"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.166936 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d691251-328b-4c09-98d1-4b968ab5bc05-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8d691251-328b-4c09-98d1-4b968ab5bc05" (UID: "8d691251-328b-4c09-98d1-4b968ab5bc05"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.169698 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d691251-328b-4c09-98d1-4b968ab5bc05-config-out" (OuterVolumeSpecName: "config-out") pod "8d691251-328b-4c09-98d1-4b968ab5bc05" (UID: "8d691251-328b-4c09-98d1-4b968ab5bc05"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.170827 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-config" (OuterVolumeSpecName: "config") pod "8d691251-328b-4c09-98d1-4b968ab5bc05" (UID: "8d691251-328b-4c09-98d1-4b968ab5bc05"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.172411 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "8d691251-328b-4c09-98d1-4b968ab5bc05" (UID: "8d691251-328b-4c09-98d1-4b968ab5bc05"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.172822 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d691251-328b-4c09-98d1-4b968ab5bc05-kube-api-access-kzf7r" (OuterVolumeSpecName: "kube-api-access-kzf7r") pod "8d691251-328b-4c09-98d1-4b968ab5bc05" (UID: "8d691251-328b-4c09-98d1-4b968ab5bc05"). InnerVolumeSpecName "kube-api-access-kzf7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.176244 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "8d691251-328b-4c09-98d1-4b968ab5bc05" (UID: "8d691251-328b-4c09-98d1-4b968ab5bc05"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.192833 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdc3ac31-1c36-4851-94d7-354178e93595" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "8d691251-328b-4c09-98d1-4b968ab5bc05" (UID: "8d691251-328b-4c09-98d1-4b968ab5bc05"). InnerVolumeSpecName "pvc-fdc3ac31-1c36-4851-94d7-354178e93595". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.196416 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-web-config" (OuterVolumeSpecName: "web-config") pod "8d691251-328b-4c09-98d1-4b968ab5bc05" (UID: "8d691251-328b-4c09-98d1-4b968ab5bc05"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.263489 4756 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.263527 4756 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.263537 4756 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.263546 4756 reconciler_common.go:293] 
"Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8d691251-328b-4c09-98d1-4b968ab5bc05-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.263557 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzf7r\" (UniqueName: \"kubernetes.io/projected/8d691251-328b-4c09-98d1-4b968ab5bc05-kube-api-access-kzf7r\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.263588 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-fdc3ac31-1c36-4851-94d7-354178e93595\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdc3ac31-1c36-4851-94d7-354178e93595\") on node \"crc\" " Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.263599 4756 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-web-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.263610 4756 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8d691251-328b-4c09-98d1-4b968ab5bc05-config-out\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.263618 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d691251-328b-4c09-98d1-4b968ab5bc05-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.263627 4756 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8d691251-328b-4c09-98d1-4b968ab5bc05-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.286938 4756 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME 
capability not set. Skipping UnmountDevice... Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.287134 4756 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-fdc3ac31-1c36-4851-94d7-354178e93595" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdc3ac31-1c36-4851-94d7-354178e93595") on node "crc" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.366146 4756 reconciler_common.go:293] "Volume detached for volume \"pvc-fdc3ac31-1c36-4851-94d7-354178e93595\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdc3ac31-1c36-4851-94d7-354178e93595\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.817957 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce","Type":"ContainerStarted","Data":"2553af4c815e015e2957c0e66da515607ec6a5ad37907645a3db8f52bc3adf5b"} Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.819163 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.822881 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d691251-328b-4c09-98d1-4b968ab5bc05","Type":"ContainerDied","Data":"9e2d53bb4e52ab0966a6dbb3cb60d42277781bd3918282f1d9b7431ffc45d86e"} Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.822911 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.822921 4756 scope.go:117] "RemoveContainer" containerID="3034b4753cd3f1e18ef15ad501e7b2623f47aee37a1bb802c2840c6d9a89c495" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.824481 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-g229b" event={"ID":"a6ee9241-87cd-43e5-90d2-869e14cc1eb6","Type":"ContainerStarted","Data":"7070ad1d9e9382acec4593cdb6ce1d60bbc3cef73a7bd05778c420838d0c1ebf"} Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.827414 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"228ca85e-a493-4dc4-9b95-5148c92ba228","Type":"ContainerStarted","Data":"7c17f87498ec948bd713efcefc461bae70853e835995551fae83a4dd7fe7ecd1"} Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.828173 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.848316 4756 scope.go:117] "RemoveContainer" containerID="d81a6b6abc89111b1be452ced5400dfe826560faf02cb4c78d822740570d0ae0" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.851245 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=67.254953391 podStartE2EDuration="1m18.851232223s" podCreationTimestamp="2026-03-18 14:19:43 +0000 UTC" firstStartedPulling="2026-03-18 14:20:00.235823661 +0000 UTC m=+1201.550241636" lastFinishedPulling="2026-03-18 14:20:11.832102463 +0000 UTC m=+1213.146520468" observedRunningTime="2026-03-18 14:21:01.841247464 +0000 UTC m=+1263.155665449" watchObservedRunningTime="2026-03-18 14:21:01.851232223 +0000 UTC m=+1263.165650198" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.874848 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] 
Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.894180 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.896533 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=61.205096039 podStartE2EDuration="1m18.896517078s" podCreationTimestamp="2026-03-18 14:19:43 +0000 UTC" firstStartedPulling="2026-03-18 14:20:01.651814078 +0000 UTC m=+1202.966232053" lastFinishedPulling="2026-03-18 14:20:19.343235117 +0000 UTC m=+1220.657653092" observedRunningTime="2026-03-18 14:21:01.892971132 +0000 UTC m=+1263.207389107" watchObservedRunningTime="2026-03-18 14:21:01.896517078 +0000 UTC m=+1263.210935053" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.914951 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 14:21:01 crc kubenswrapper[4756]: E0318 14:21:01.915274 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00bfff2e-d59e-4936-b0e1-3476f2d01242" containerName="swift-ring-rebalance" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.915290 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="00bfff2e-d59e-4936-b0e1-3476f2d01242" containerName="swift-ring-rebalance" Mar 18 14:21:01 crc kubenswrapper[4756]: E0318 14:21:01.915303 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc602be-eea4-4a22-bfba-1f6319b32064" containerName="mariadb-account-create-update" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.915310 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc602be-eea4-4a22-bfba-1f6319b32064" containerName="mariadb-account-create-update" Mar 18 14:21:01 crc kubenswrapper[4756]: E0318 14:21:01.915324 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerName="init-config-reloader" Mar 18 
14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.915330 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerName="init-config-reloader" Mar 18 14:21:01 crc kubenswrapper[4756]: E0318 14:21:01.915343 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerName="prometheus" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.915352 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerName="prometheus" Mar 18 14:21:01 crc kubenswrapper[4756]: E0318 14:21:01.915361 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerName="thanos-sidecar" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.915368 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerName="thanos-sidecar" Mar 18 14:21:01 crc kubenswrapper[4756]: E0318 14:21:01.915386 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerName="config-reloader" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.915393 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerName="config-reloader" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.915547 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc602be-eea4-4a22-bfba-1f6319b32064" containerName="mariadb-account-create-update" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.915555 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerName="thanos-sidecar" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.915565 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="00bfff2e-d59e-4936-b0e1-3476f2d01242" containerName="swift-ring-rebalance" Mar 
18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.915579 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerName="prometheus" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.915590 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" containerName="config-reloader" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.917048 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.924043 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-g229b" podStartSLOduration=2.503353637 podStartE2EDuration="13.924025812s" podCreationTimestamp="2026-03-18 14:20:48 +0000 UTC" firstStartedPulling="2026-03-18 14:20:49.539267219 +0000 UTC m=+1250.853685204" lastFinishedPulling="2026-03-18 14:21:00.959939404 +0000 UTC m=+1262.274357379" observedRunningTime="2026-03-18 14:21:01.914570536 +0000 UTC m=+1263.228988511" watchObservedRunningTime="2026-03-18 14:21:01.924025812 +0000 UTC m=+1263.238443787" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.929617 4756 scope.go:117] "RemoveContainer" containerID="fca1a509e14bcad5b9c44607728d8f04e898011fd7a5677443fc809dbc712992" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.930430 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.930653 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.930969 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.931060 4756 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.931980 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.932109 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kdkt7" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.932224 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.932344 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.934980 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.955271 4756 scope.go:117] "RemoveContainer" containerID="a1ff35615a64ea0b0976834b6197c47f1c1b410ff679b5b20035e885f047d65c" Mar 18 14:21:01 crc kubenswrapper[4756]: I0318 14:21:01.955938 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.087825 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.087883 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.087911 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fdc3ac31-1c36-4851-94d7-354178e93595\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdc3ac31-1c36-4851-94d7-354178e93595\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.087984 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.088004 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.088026 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/d3674269-04c7-45df-ad72-38d1bb5aab93-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.088043 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d3674269-04c7-45df-ad72-38d1bb5aab93-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.088059 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d3674269-04c7-45df-ad72-38d1bb5aab93-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.088073 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82vmd\" (UniqueName: \"kubernetes.io/projected/d3674269-04c7-45df-ad72-38d1bb5aab93-kube-api-access-82vmd\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.088094 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d3674269-04c7-45df-ad72-38d1bb5aab93-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: 
I0318 14:21:02.088136 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d3674269-04c7-45df-ad72-38d1bb5aab93-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.088151 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-config\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.088181 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.189346 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d3674269-04c7-45df-ad72-38d1bb5aab93-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.189383 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-config\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.189420 
4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.189452 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.189486 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.189509 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fdc3ac31-1c36-4851-94d7-354178e93595\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdc3ac31-1c36-4851-94d7-354178e93595\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.189584 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") 
pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.189618 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.189641 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d3674269-04c7-45df-ad72-38d1bb5aab93-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.189660 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d3674269-04c7-45df-ad72-38d1bb5aab93-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.189681 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d3674269-04c7-45df-ad72-38d1bb5aab93-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.189697 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-82vmd\" (UniqueName: \"kubernetes.io/projected/d3674269-04c7-45df-ad72-38d1bb5aab93-kube-api-access-82vmd\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.189720 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d3674269-04c7-45df-ad72-38d1bb5aab93-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.190627 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d3674269-04c7-45df-ad72-38d1bb5aab93-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.190663 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d3674269-04c7-45df-ad72-38d1bb5aab93-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.193684 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d3674269-04c7-45df-ad72-38d1bb5aab93-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.195518 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.195556 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d3674269-04c7-45df-ad72-38d1bb5aab93-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.196632 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d3674269-04c7-45df-ad72-38d1bb5aab93-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.196950 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.197335 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-config\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 
crc kubenswrapper[4756]: I0318 14:21:02.197643 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.198215 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.198243 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fdc3ac31-1c36-4851-94d7-354178e93595\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdc3ac31-1c36-4851-94d7-354178e93595\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/810f25a37f71de248104173ceeca717706568db9cb74ba5fdae93f590561981a/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.198890 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.209279 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82vmd\" (UniqueName: \"kubernetes.io/projected/d3674269-04c7-45df-ad72-38d1bb5aab93-kube-api-access-82vmd\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 
14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.207897 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d3674269-04c7-45df-ad72-38d1bb5aab93-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.249617 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fdc3ac31-1c36-4851-94d7-354178e93595\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fdc3ac31-1c36-4851-94d7-354178e93595\") pod \"prometheus-metric-storage-0\" (UID: \"d3674269-04c7-45df-ad72-38d1bb5aab93\") " pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.256376 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.688192 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 14:21:02 crc kubenswrapper[4756]: W0318 14:21:02.694946 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3674269_04c7_45df_ad72_38d1bb5aab93.slice/crio-5c0c3e90f6920a180ecd7b3fbaf1c7c8d8205bb472dc254f505cedd6e637d7a3 WatchSource:0}: Error finding container 5c0c3e90f6920a180ecd7b3fbaf1c7c8d8205bb472dc254f505cedd6e637d7a3: Status 404 returned error can't find the container with id 5c0c3e90f6920a180ecd7b3fbaf1c7c8d8205bb472dc254f505cedd6e637d7a3 Mar 18 14:21:02 crc kubenswrapper[4756]: I0318 14:21:02.837274 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d3674269-04c7-45df-ad72-38d1bb5aab93","Type":"ContainerStarted","Data":"5c0c3e90f6920a180ecd7b3fbaf1c7c8d8205bb472dc254f505cedd6e637d7a3"} Mar 18 14:21:03 
crc kubenswrapper[4756]: I0318 14:21:03.105330 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:21:03 crc kubenswrapper[4756]: I0318 14:21:03.111417 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e4d16-f5d1-4367-ad0e-baf820923225-etc-swift\") pod \"swift-storage-0\" (UID: \"717e4d16-f5d1-4367-ad0e-baf820923225\") " pod="openstack/swift-storage-0" Mar 18 14:21:03 crc kubenswrapper[4756]: I0318 14:21:03.328903 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d691251-328b-4c09-98d1-4b968ab5bc05" path="/var/lib/kubelet/pods/8d691251-328b-4c09-98d1-4b968ab5bc05/volumes" Mar 18 14:21:03 crc kubenswrapper[4756]: I0318 14:21:03.391214 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 18 14:21:04 crc kubenswrapper[4756]: I0318 14:21:04.035925 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 14:21:04 crc kubenswrapper[4756]: I0318 14:21:04.080063 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-r6s8c" podUID="99dfb896-59f3-4f93-8d0e-4b19b49cbc56" containerName="ovn-controller" probeResult="failure" output=< Mar 18 14:21:04 crc kubenswrapper[4756]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 14:21:04 crc kubenswrapper[4756]: > Mar 18 14:21:04 crc kubenswrapper[4756]: I0318 14:21:04.084411 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:21:04 crc kubenswrapper[4756]: I0318 14:21:04.858859 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"2bf742487004bb26d368e1a3daff4531e6238464acc4bb8e29921728437b3e33"} Mar 18 14:21:05 crc kubenswrapper[4756]: I0318 14:21:05.870167 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d3674269-04c7-45df-ad72-38d1bb5aab93","Type":"ContainerStarted","Data":"8a11330839b99e6201f15cac1838ee58f445cc8860a79e8e28ce2a22ee8c150f"} Mar 18 14:21:05 crc kubenswrapper[4756]: I0318 14:21:05.872343 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"379c1ead9f38849965d224ebe3bb5c9ccabff570fd41934e3e6ef8c78e2fd49f"} Mar 18 14:21:05 crc kubenswrapper[4756]: I0318 14:21:05.872382 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"50dc29973657023d91c5289b661431a1d69a17ea133320b0b88d2ef3f711bf22"} Mar 18 14:21:05 crc kubenswrapper[4756]: I0318 14:21:05.872391 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"6b24a0f92561e82f15a6deaf4290552129f0c082fd9910989b20706349504221"} Mar 18 14:21:06 crc kubenswrapper[4756]: I0318 14:21:06.883861 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"870e4018a451c8fecc0aae7aa201d293808fb93ed9c2b5916a0b57376cad35f0"} Mar 18 14:21:07 crc kubenswrapper[4756]: I0318 14:21:07.898274 4756 generic.go:334] "Generic (PLEG): container finished" podID="a6ee9241-87cd-43e5-90d2-869e14cc1eb6" containerID="7070ad1d9e9382acec4593cdb6ce1d60bbc3cef73a7bd05778c420838d0c1ebf" exitCode=0 Mar 18 14:21:07 crc kubenswrapper[4756]: I0318 14:21:07.898358 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-g229b" event={"ID":"a6ee9241-87cd-43e5-90d2-869e14cc1eb6","Type":"ContainerDied","Data":"7070ad1d9e9382acec4593cdb6ce1d60bbc3cef73a7bd05778c420838d0c1ebf"} Mar 18 14:21:07 crc kubenswrapper[4756]: I0318 14:21:07.906018 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"26159bb955b945af68372b06df50907ca135519c7984215ac2088c8ffc4efb15"} Mar 18 14:21:07 crc kubenswrapper[4756]: I0318 14:21:07.906363 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"9027b89517840a87208a12250c2d3f157e7e06bee9c4d24cbd435e3b179bccbe"} Mar 18 14:21:07 crc kubenswrapper[4756]: I0318 14:21:07.906373 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"4671a9e3da5505969ac2e83087aaae55c2dc0e53e075f7abd56ba4e8f48e38ba"} Mar 18 14:21:07 crc kubenswrapper[4756]: I0318 14:21:07.906382 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"09bede52197f33667a779e557f026b976e9f041e3fb1645eae65f1d52889b2ea"} Mar 18 14:21:08 crc kubenswrapper[4756]: I0318 14:21:08.840511 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="ca8fb9b6-a1a9-4781-af8f-2e7e78e62771" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.089588 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vsj9n" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.090720 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-r6s8c" podUID="99dfb896-59f3-4f93-8d0e-4b19b49cbc56" containerName="ovn-controller" probeResult="failure" output=< Mar 18 14:21:09 crc kubenswrapper[4756]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 14:21:09 crc kubenswrapper[4756]: > Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.351308 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r6s8c-config-5j8gs"] Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.357878 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.364607 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.379136 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r6s8c-config-5j8gs"] Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.428923 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5a4de511-0ebb-420b-a1c0-249ed61997c5-additional-scripts\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.428966 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-run\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.429000 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a4de511-0ebb-420b-a1c0-249ed61997c5-scripts\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.429020 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-run-ovn\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: 
\"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.429091 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpzjw\" (UniqueName: \"kubernetes.io/projected/5a4de511-0ebb-420b-a1c0-249ed61997c5-kube-api-access-rpzjw\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.429130 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-log-ovn\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.442288 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-g229b" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.530151 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpzjw\" (UniqueName: \"kubernetes.io/projected/5a4de511-0ebb-420b-a1c0-249ed61997c5-kube-api-access-rpzjw\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.530195 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-log-ovn\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.530296 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5a4de511-0ebb-420b-a1c0-249ed61997c5-additional-scripts\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.530315 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-run\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.530344 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a4de511-0ebb-420b-a1c0-249ed61997c5-scripts\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " 
pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.530361 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-run-ovn\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.530559 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-run-ovn\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.530877 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-log-ovn\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.531083 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-run\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.531444 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5a4de511-0ebb-420b-a1c0-249ed61997c5-additional-scripts\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 
14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.533654 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a4de511-0ebb-420b-a1c0-249ed61997c5-scripts\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.547158 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpzjw\" (UniqueName: \"kubernetes.io/projected/5a4de511-0ebb-420b-a1c0-249ed61997c5-kube-api-access-rpzjw\") pod \"ovn-controller-r6s8c-config-5j8gs\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.632326 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-db-sync-config-data\") pod \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.632524 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-combined-ca-bundle\") pod \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.632579 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-config-data\") pod \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.632723 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xjfwp\" (UniqueName: \"kubernetes.io/projected/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-kube-api-access-xjfwp\") pod \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\" (UID: \"a6ee9241-87cd-43e5-90d2-869e14cc1eb6\") " Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.643319 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a6ee9241-87cd-43e5-90d2-869e14cc1eb6" (UID: "a6ee9241-87cd-43e5-90d2-869e14cc1eb6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.643377 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-kube-api-access-xjfwp" (OuterVolumeSpecName: "kube-api-access-xjfwp") pod "a6ee9241-87cd-43e5-90d2-869e14cc1eb6" (UID: "a6ee9241-87cd-43e5-90d2-869e14cc1eb6"). InnerVolumeSpecName "kube-api-access-xjfwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.672850 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6ee9241-87cd-43e5-90d2-869e14cc1eb6" (UID: "a6ee9241-87cd-43e5-90d2-869e14cc1eb6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.688597 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-config-data" (OuterVolumeSpecName: "config-data") pod "a6ee9241-87cd-43e5-90d2-869e14cc1eb6" (UID: "a6ee9241-87cd-43e5-90d2-869e14cc1eb6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.734346 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.734380 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.734389 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjfwp\" (UniqueName: \"kubernetes.io/projected/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-kube-api-access-xjfwp\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.734399 4756 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a6ee9241-87cd-43e5-90d2-869e14cc1eb6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.754789 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.961588 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"dbfab37adf8266039dfffd6636b4fb8f757031a82881ab7b614429a7987ce5e3"} Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.961891 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"4dbc4f9140016b504fef1029f4ecc1cdd47c6225a0c02b31ba214979f8e844fc"} Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.961903 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"9f51983301854caa994ba18c18d8fd1787ae1ca77e8970933ede47a37a6dfc14"} Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.961912 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"8326487eb464b801ace3de252373f46eb6239cce4ab1438f96194c42da3ef464"} Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.961922 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"c9ba96207ad157be28f0a7090ed32793e0442744529934ff1f372bb95508bc3a"} Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.974445 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-g229b" event={"ID":"a6ee9241-87cd-43e5-90d2-869e14cc1eb6","Type":"ContainerDied","Data":"6c1227b0ece1cca7a7acbd0ccd1734a9b7b15c618c99954f89ba014c89bfcf33"} Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.974482 4756 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="6c1227b0ece1cca7a7acbd0ccd1734a9b7b15c618c99954f89ba014c89bfcf33" Mar 18 14:21:09 crc kubenswrapper[4756]: I0318 14:21:09.974533 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-g229b" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.230589 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-j5ckz"] Mar 18 14:21:10 crc kubenswrapper[4756]: E0318 14:21:10.230972 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ee9241-87cd-43e5-90d2-869e14cc1eb6" containerName="glance-db-sync" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.230985 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ee9241-87cd-43e5-90d2-869e14cc1eb6" containerName="glance-db-sync" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.231169 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ee9241-87cd-43e5-90d2-869e14cc1eb6" containerName="glance-db-sync" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.232090 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.247406 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-j5ckz"] Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.249057 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-j5ckz\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.249130 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-j5ckz\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.249152 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8zvx\" (UniqueName: \"kubernetes.io/projected/a835ae65-0067-4f1d-92cc-26f7794c535c-kube-api-access-g8zvx\") pod \"dnsmasq-dns-5b946c75cc-j5ckz\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.249178 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-config\") pod \"dnsmasq-dns-5b946c75cc-j5ckz\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.249206 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-j5ckz\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.268470 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r6s8c-config-5j8gs"] Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.350932 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-j5ckz\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.350997 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-j5ckz\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.351024 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8zvx\" (UniqueName: \"kubernetes.io/projected/a835ae65-0067-4f1d-92cc-26f7794c535c-kube-api-access-g8zvx\") pod \"dnsmasq-dns-5b946c75cc-j5ckz\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.351054 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-config\") pod \"dnsmasq-dns-5b946c75cc-j5ckz\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc 
kubenswrapper[4756]: I0318 14:21:10.351083 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-j5ckz\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.351953 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-j5ckz\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.352431 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-config\") pod \"dnsmasq-dns-5b946c75cc-j5ckz\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.352794 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-j5ckz\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.357872 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-j5ckz\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.372594 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-g8zvx\" (UniqueName: \"kubernetes.io/projected/a835ae65-0067-4f1d-92cc-26f7794c535c-kube-api-access-g8zvx\") pod \"dnsmasq-dns-5b946c75cc-j5ckz\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.625915 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.997436 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"3196d06a8a8f97cfc0b8b13734ef40e2b20986ac372ed3ad78e7a70b8aa212cf"} Mar 18 14:21:10 crc kubenswrapper[4756]: I0318 14:21:10.997801 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e4d16-f5d1-4367-ad0e-baf820923225","Type":"ContainerStarted","Data":"5f0132568c6ddf94bc19883499ce5bc2d8965fadc62ad44970c4ddf60f6b7c13"} Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:10.999955 4756 generic.go:334] "Generic (PLEG): container finished" podID="5a4de511-0ebb-420b-a1c0-249ed61997c5" containerID="df35257fe5b149f9fdb55e31ca98d4f6848cea9f144339254af9a623836e2d60" exitCode=0 Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:10.999996 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r6s8c-config-5j8gs" event={"ID":"5a4de511-0ebb-420b-a1c0-249ed61997c5","Type":"ContainerDied","Data":"df35257fe5b149f9fdb55e31ca98d4f6848cea9f144339254af9a623836e2d60"} Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.000019 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r6s8c-config-5j8gs" event={"ID":"5a4de511-0ebb-420b-a1c0-249ed61997c5","Type":"ContainerStarted","Data":"84f7059fa0201158015dd203752b20f886e9888fc288a8616c1a34e4291cac83"} Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 
14:21:11.043061 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.27987658 podStartE2EDuration="41.043035881s" podCreationTimestamp="2026-03-18 14:20:30 +0000 UTC" firstStartedPulling="2026-03-18 14:21:04.038149366 +0000 UTC m=+1265.352567351" lastFinishedPulling="2026-03-18 14:21:08.801308657 +0000 UTC m=+1270.115726652" observedRunningTime="2026-03-18 14:21:11.033906154 +0000 UTC m=+1272.348324129" watchObservedRunningTime="2026-03-18 14:21:11.043035881 +0000 UTC m=+1272.357453856" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.095931 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-j5ckz"] Mar 18 14:21:11 crc kubenswrapper[4756]: W0318 14:21:11.099478 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda835ae65_0067_4f1d_92cc_26f7794c535c.slice/crio-bbc40aa865b41f849c3fefd89cfc0d194556acf0a4c97d72e5e1acf915595f99 WatchSource:0}: Error finding container bbc40aa865b41f849c3fefd89cfc0d194556acf0a4c97d72e5e1acf915595f99: Status 404 returned error can't find the container with id bbc40aa865b41f849c3fefd89cfc0d194556acf0a4c97d72e5e1acf915595f99 Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.395464 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-j5ckz"] Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.446973 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-bx844"] Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.448602 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.450718 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.460232 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-bx844"] Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.573452 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d25d\" (UniqueName: \"kubernetes.io/projected/c569a795-118f-49c7-850d-798474e0b461-kube-api-access-5d25d\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.573509 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.573572 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.573614 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.573637 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.573677 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-config\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.674993 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-config\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.675098 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d25d\" (UniqueName: \"kubernetes.io/projected/c569a795-118f-49c7-850d-798474e0b461-kube-api-access-5d25d\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.675144 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.675193 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.675238 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.675270 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.676404 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.676514 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 
14:21:11.676562 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.677574 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.677673 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-config\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.693692 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d25d\" (UniqueName: \"kubernetes.io/projected/c569a795-118f-49c7-850d-798474e0b461-kube-api-access-5d25d\") pod \"dnsmasq-dns-74f6bcbc87-bx844\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:11 crc kubenswrapper[4756]: I0318 14:21:11.763532 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.014503 4756 generic.go:334] "Generic (PLEG): container finished" podID="a835ae65-0067-4f1d-92cc-26f7794c535c" containerID="43ab81be4f3a92e87732d2cc9a1405b3805c844c9b0010fac13932dd68faf8c1" exitCode=0 Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.014599 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" event={"ID":"a835ae65-0067-4f1d-92cc-26f7794c535c","Type":"ContainerDied","Data":"43ab81be4f3a92e87732d2cc9a1405b3805c844c9b0010fac13932dd68faf8c1"} Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.016161 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" event={"ID":"a835ae65-0067-4f1d-92cc-26f7794c535c","Type":"ContainerStarted","Data":"bbc40aa865b41f849c3fefd89cfc0d194556acf0a4c97d72e5e1acf915595f99"} Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.025942 4756 generic.go:334] "Generic (PLEG): container finished" podID="d3674269-04c7-45df-ad72-38d1bb5aab93" containerID="8a11330839b99e6201f15cac1838ee58f445cc8860a79e8e28ce2a22ee8c150f" exitCode=0 Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.026006 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d3674269-04c7-45df-ad72-38d1bb5aab93","Type":"ContainerDied","Data":"8a11330839b99e6201f15cac1838ee58f445cc8860a79e8e28ce2a22ee8c150f"} Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.236272 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-bx844"] Mar 18 14:21:12 crc kubenswrapper[4756]: W0318 14:21:12.242594 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc569a795_118f_49c7_850d_798474e0b461.slice/crio-8149f0290a3789951e83ee56114fd62d6b65b8857e52fd0a4402e209458c23f5 
WatchSource:0}: Error finding container 8149f0290a3789951e83ee56114fd62d6b65b8857e52fd0a4402e209458c23f5: Status 404 returned error can't find the container with id 8149f0290a3789951e83ee56114fd62d6b65b8857e52fd0a4402e209458c23f5 Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.443965 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.454799 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.592921 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-ovsdbserver-nb\") pod \"a835ae65-0067-4f1d-92cc-26f7794c535c\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.592970 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-log-ovn\") pod \"5a4de511-0ebb-420b-a1c0-249ed61997c5\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.593048 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a4de511-0ebb-420b-a1c0-249ed61997c5-scripts\") pod \"5a4de511-0ebb-420b-a1c0-249ed61997c5\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.593075 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-ovsdbserver-sb\") pod \"a835ae65-0067-4f1d-92cc-26f7794c535c\" (UID: 
\"a835ae65-0067-4f1d-92cc-26f7794c535c\") " Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.593089 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-run-ovn\") pod \"5a4de511-0ebb-420b-a1c0-249ed61997c5\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.593182 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8zvx\" (UniqueName: \"kubernetes.io/projected/a835ae65-0067-4f1d-92cc-26f7794c535c-kube-api-access-g8zvx\") pod \"a835ae65-0067-4f1d-92cc-26f7794c535c\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.593179 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5a4de511-0ebb-420b-a1c0-249ed61997c5" (UID: "5a4de511-0ebb-420b-a1c0-249ed61997c5"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.593203 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-dns-svc\") pod \"a835ae65-0067-4f1d-92cc-26f7794c535c\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.593237 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-config\") pod \"a835ae65-0067-4f1d-92cc-26f7794c535c\" (UID: \"a835ae65-0067-4f1d-92cc-26f7794c535c\") " Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.593256 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpzjw\" (UniqueName: \"kubernetes.io/projected/5a4de511-0ebb-420b-a1c0-249ed61997c5-kube-api-access-rpzjw\") pod \"5a4de511-0ebb-420b-a1c0-249ed61997c5\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.593293 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-run\") pod \"5a4de511-0ebb-420b-a1c0-249ed61997c5\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.593346 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5a4de511-0ebb-420b-a1c0-249ed61997c5-additional-scripts\") pod \"5a4de511-0ebb-420b-a1c0-249ed61997c5\" (UID: \"5a4de511-0ebb-420b-a1c0-249ed61997c5\") " Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.593863 4756 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.593234 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5a4de511-0ebb-420b-a1c0-249ed61997c5" (UID: "5a4de511-0ebb-420b-a1c0-249ed61997c5"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.594142 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-run" (OuterVolumeSpecName: "var-run") pod "5a4de511-0ebb-420b-a1c0-249ed61997c5" (UID: "5a4de511-0ebb-420b-a1c0-249ed61997c5"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.594441 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a4de511-0ebb-420b-a1c0-249ed61997c5-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5a4de511-0ebb-420b-a1c0-249ed61997c5" (UID: "5a4de511-0ebb-420b-a1c0-249ed61997c5"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.595217 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a4de511-0ebb-420b-a1c0-249ed61997c5-scripts" (OuterVolumeSpecName: "scripts") pod "5a4de511-0ebb-420b-a1c0-249ed61997c5" (UID: "5a4de511-0ebb-420b-a1c0-249ed61997c5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.597739 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a835ae65-0067-4f1d-92cc-26f7794c535c-kube-api-access-g8zvx" (OuterVolumeSpecName: "kube-api-access-g8zvx") pod "a835ae65-0067-4f1d-92cc-26f7794c535c" (UID: "a835ae65-0067-4f1d-92cc-26f7794c535c"). InnerVolumeSpecName "kube-api-access-g8zvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.601056 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a4de511-0ebb-420b-a1c0-249ed61997c5-kube-api-access-rpzjw" (OuterVolumeSpecName: "kube-api-access-rpzjw") pod "5a4de511-0ebb-420b-a1c0-249ed61997c5" (UID: "5a4de511-0ebb-420b-a1c0-249ed61997c5"). InnerVolumeSpecName "kube-api-access-rpzjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.615429 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-config" (OuterVolumeSpecName: "config") pod "a835ae65-0067-4f1d-92cc-26f7794c535c" (UID: "a835ae65-0067-4f1d-92cc-26f7794c535c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.619383 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a835ae65-0067-4f1d-92cc-26f7794c535c" (UID: "a835ae65-0067-4f1d-92cc-26f7794c535c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.626345 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a835ae65-0067-4f1d-92cc-26f7794c535c" (UID: "a835ae65-0067-4f1d-92cc-26f7794c535c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.632560 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a835ae65-0067-4f1d-92cc-26f7794c535c" (UID: "a835ae65-0067-4f1d-92cc-26f7794c535c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.696422 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8zvx\" (UniqueName: \"kubernetes.io/projected/a835ae65-0067-4f1d-92cc-26f7794c535c-kube-api-access-g8zvx\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.696475 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.696495 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.696514 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpzjw\" (UniqueName: \"kubernetes.io/projected/5a4de511-0ebb-420b-a1c0-249ed61997c5-kube-api-access-rpzjw\") on node \"crc\" DevicePath \"\"" Mar 18 
14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.696533 4756 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.696550 4756 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5a4de511-0ebb-420b-a1c0-249ed61997c5-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.696569 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.696586 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a4de511-0ebb-420b-a1c0-249ed61997c5-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.696603 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a835ae65-0067-4f1d-92cc-26f7794c535c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:12 crc kubenswrapper[4756]: I0318 14:21:12.696621 4756 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a4de511-0ebb-420b-a1c0-249ed61997c5-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.037222 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d3674269-04c7-45df-ad72-38d1bb5aab93","Type":"ContainerStarted","Data":"a99c2f4f45cfc80193dca9c7550d8cbf7eebda4b3a37e1c71e802dd6012b8427"} Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.039408 4756 generic.go:334] "Generic (PLEG): container 
finished" podID="c569a795-118f-49c7-850d-798474e0b461" containerID="f50813530ac27105a5dd2da41073898334f1bc8e57d331b5c78e4bac0f358d14" exitCode=0 Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.039903 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" event={"ID":"c569a795-118f-49c7-850d-798474e0b461","Type":"ContainerDied","Data":"f50813530ac27105a5dd2da41073898334f1bc8e57d331b5c78e4bac0f358d14"} Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.040485 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" event={"ID":"c569a795-118f-49c7-850d-798474e0b461","Type":"ContainerStarted","Data":"8149f0290a3789951e83ee56114fd62d6b65b8857e52fd0a4402e209458c23f5"} Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.041811 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" event={"ID":"a835ae65-0067-4f1d-92cc-26f7794c535c","Type":"ContainerDied","Data":"bbc40aa865b41f849c3fefd89cfc0d194556acf0a4c97d72e5e1acf915595f99"} Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.042023 4756 scope.go:117] "RemoveContainer" containerID="43ab81be4f3a92e87732d2cc9a1405b3805c844c9b0010fac13932dd68faf8c1" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.041881 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-j5ckz" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.043858 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r6s8c-config-5j8gs" event={"ID":"5a4de511-0ebb-420b-a1c0-249ed61997c5","Type":"ContainerDied","Data":"84f7059fa0201158015dd203752b20f886e9888fc288a8616c1a34e4291cac83"} Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.043924 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84f7059fa0201158015dd203752b20f886e9888fc288a8616c1a34e4291cac83" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.044002 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r6s8c-config-5j8gs" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.225929 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-j5ckz"] Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.233890 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-j5ckz"] Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.325514 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a835ae65-0067-4f1d-92cc-26f7794c535c" path="/var/lib/kubelet/pods/a835ae65-0067-4f1d-92cc-26f7794c535c/volumes" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.565256 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-r6s8c-config-5j8gs"] Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.572932 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-r6s8c-config-5j8gs"] Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.662786 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r6s8c-config-fxq89"] Mar 18 14:21:13 crc kubenswrapper[4756]: E0318 14:21:13.663161 4756 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5a4de511-0ebb-420b-a1c0-249ed61997c5" containerName="ovn-config" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.663179 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a4de511-0ebb-420b-a1c0-249ed61997c5" containerName="ovn-config" Mar 18 14:21:13 crc kubenswrapper[4756]: E0318 14:21:13.663203 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a835ae65-0067-4f1d-92cc-26f7794c535c" containerName="init" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.663210 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a835ae65-0067-4f1d-92cc-26f7794c535c" containerName="init" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.663363 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a4de511-0ebb-420b-a1c0-249ed61997c5" containerName="ovn-config" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.663388 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a835ae65-0067-4f1d-92cc-26f7794c535c" containerName="init" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.663964 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.667047 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.671950 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r6s8c-config-fxq89"] Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.818772 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-log-ovn\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.818841 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-run\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.818898 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12e57d08-4793-42ee-b2dc-a3d7b6828f23-scripts\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.818957 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12e57d08-4793-42ee-b2dc-a3d7b6828f23-additional-scripts\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: 
\"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.818991 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbdzw\" (UniqueName: \"kubernetes.io/projected/12e57d08-4793-42ee-b2dc-a3d7b6828f23-kube-api-access-wbdzw\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.819055 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-run-ovn\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.920411 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-run-ovn\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.920507 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-log-ovn\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.920529 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-run\") pod 
\"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.920559 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12e57d08-4793-42ee-b2dc-a3d7b6828f23-scripts\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.920597 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12e57d08-4793-42ee-b2dc-a3d7b6828f23-additional-scripts\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.920631 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbdzw\" (UniqueName: \"kubernetes.io/projected/12e57d08-4793-42ee-b2dc-a3d7b6828f23-kube-api-access-wbdzw\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.920737 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-run-ovn\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.920802 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-log-ovn\") pod 
\"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.920898 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-run\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.921553 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12e57d08-4793-42ee-b2dc-a3d7b6828f23-additional-scripts\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.922797 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12e57d08-4793-42ee-b2dc-a3d7b6828f23-scripts\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.941237 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbdzw\" (UniqueName: \"kubernetes.io/projected/12e57d08-4793-42ee-b2dc-a3d7b6828f23-kube-api-access-wbdzw\") pod \"ovn-controller-r6s8c-config-fxq89\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:13 crc kubenswrapper[4756]: I0318 14:21:13.981494 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:14 crc kubenswrapper[4756]: I0318 14:21:14.070027 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" event={"ID":"c569a795-118f-49c7-850d-798474e0b461","Type":"ContainerStarted","Data":"3b6bf8cab0f4a060d8b68d1b475d2350915f50ed8aa12154268eba9fa4a5b39f"} Mar 18 14:21:14 crc kubenswrapper[4756]: I0318 14:21:14.071488 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:14 crc kubenswrapper[4756]: I0318 14:21:14.079254 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-r6s8c" Mar 18 14:21:14 crc kubenswrapper[4756]: I0318 14:21:14.099130 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" podStartSLOduration=3.099100304 podStartE2EDuration="3.099100304s" podCreationTimestamp="2026-03-18 14:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:21:14.089894175 +0000 UTC m=+1275.404312150" watchObservedRunningTime="2026-03-18 14:21:14.099100304 +0000 UTC m=+1275.413518279" Mar 18 14:21:14 crc kubenswrapper[4756]: I0318 14:21:14.421599 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r6s8c-config-fxq89"] Mar 18 14:21:14 crc kubenswrapper[4756]: W0318 14:21:14.422735 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12e57d08_4793_42ee_b2dc_a3d7b6828f23.slice/crio-731fb619fedc30e949563fb4352f8a38f67ae6da11bc11452293eb4e49d1caca WatchSource:0}: Error finding container 731fb619fedc30e949563fb4352f8a38f67ae6da11bc11452293eb4e49d1caca: Status 404 returned error can't find the container with id 
731fb619fedc30e949563fb4352f8a38f67ae6da11bc11452293eb4e49d1caca Mar 18 14:21:14 crc kubenswrapper[4756]: I0318 14:21:14.650275 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:21:14 crc kubenswrapper[4756]: I0318 14:21:14.969962 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 14:21:15 crc kubenswrapper[4756]: I0318 14:21:15.087036 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r6s8c-config-fxq89" event={"ID":"12e57d08-4793-42ee-b2dc-a3d7b6828f23","Type":"ContainerStarted","Data":"7e7793b6e1ef38a798923bc73c219a940941c5e9089015eda346a7411fa3fae1"} Mar 18 14:21:15 crc kubenswrapper[4756]: I0318 14:21:15.087092 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r6s8c-config-fxq89" event={"ID":"12e57d08-4793-42ee-b2dc-a3d7b6828f23","Type":"ContainerStarted","Data":"731fb619fedc30e949563fb4352f8a38f67ae6da11bc11452293eb4e49d1caca"} Mar 18 14:21:15 crc kubenswrapper[4756]: I0318 14:21:15.107890 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-r6s8c-config-fxq89" podStartSLOduration=2.10785322 podStartE2EDuration="2.10785322s" podCreationTimestamp="2026-03-18 14:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:21:15.101720314 +0000 UTC m=+1276.416138289" watchObservedRunningTime="2026-03-18 14:21:15.10785322 +0000 UTC m=+1276.422271195" Mar 18 14:21:15 crc kubenswrapper[4756]: I0318 14:21:15.331905 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a4de511-0ebb-420b-a1c0-249ed61997c5" path="/var/lib/kubelet/pods/5a4de511-0ebb-420b-a1c0-249ed61997c5/volumes" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.098624 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"d3674269-04c7-45df-ad72-38d1bb5aab93","Type":"ContainerStarted","Data":"928e7f754251fcce1494404e1261d50f0c5ec931829502acd2b5967d5688f0ca"} Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.098987 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d3674269-04c7-45df-ad72-38d1bb5aab93","Type":"ContainerStarted","Data":"9a2f6b2579eda4036c62922ce35c902f231de4a7b5e62a0284571a550e718baa"} Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.101017 4756 generic.go:334] "Generic (PLEG): container finished" podID="12e57d08-4793-42ee-b2dc-a3d7b6828f23" containerID="7e7793b6e1ef38a798923bc73c219a940941c5e9089015eda346a7411fa3fae1" exitCode=0 Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.101150 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r6s8c-config-fxq89" event={"ID":"12e57d08-4793-42ee-b2dc-a3d7b6828f23","Type":"ContainerDied","Data":"7e7793b6e1ef38a798923bc73c219a940941c5e9089015eda346a7411fa3fae1"} Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.141951 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.1418975 podStartE2EDuration="15.1418975s" podCreationTimestamp="2026-03-18 14:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:21:16.135561688 +0000 UTC m=+1277.449979683" watchObservedRunningTime="2026-03-18 14:21:16.1418975 +0000 UTC m=+1277.456315515" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.703493 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-6vpwc"] Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.704703 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-6vpwc" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.714148 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-6vpwc"] Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.782189 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95-operator-scripts\") pod \"cloudkitty-db-create-6vpwc\" (UID: \"1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95\") " pod="openstack/cloudkitty-db-create-6vpwc" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.782389 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k27x\" (UniqueName: \"kubernetes.io/projected/1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95-kube-api-access-9k27x\") pod \"cloudkitty-db-create-6vpwc\" (UID: \"1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95\") " pod="openstack/cloudkitty-db-create-6vpwc" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.819725 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-dq884"] Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.823591 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dq884" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.848568 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-1934-account-create-update-d94f6"] Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.849811 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1934-account-create-update-d94f6" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.852160 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.861466 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-dq884"] Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.887179 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95-operator-scripts\") pod \"cloudkitty-db-create-6vpwc\" (UID: \"1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95\") " pod="openstack/cloudkitty-db-create-6vpwc" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.887460 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k27x\" (UniqueName: \"kubernetes.io/projected/1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95-kube-api-access-9k27x\") pod \"cloudkitty-db-create-6vpwc\" (UID: \"1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95\") " pod="openstack/cloudkitty-db-create-6vpwc" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.888065 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95-operator-scripts\") pod \"cloudkitty-db-create-6vpwc\" (UID: \"1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95\") " pod="openstack/cloudkitty-db-create-6vpwc" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.897438 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1934-account-create-update-d94f6"] Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.927564 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k27x\" (UniqueName: 
\"kubernetes.io/projected/1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95-kube-api-access-9k27x\") pod \"cloudkitty-db-create-6vpwc\" (UID: \"1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95\") " pod="openstack/cloudkitty-db-create-6vpwc" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.980153 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d7ef-account-create-update-6xjvm"] Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.981417 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d7ef-account-create-update-6xjvm" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.986345 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.989397 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4c68a42-f073-4460-82f4-d5ddaaa26b05-operator-scripts\") pod \"cinder-db-create-dq884\" (UID: \"a4c68a42-f073-4460-82f4-d5ddaaa26b05\") " pod="openstack/cinder-db-create-dq884" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.989483 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71dfb945-64cc-4994-bd4e-591797fc6ca8-operator-scripts\") pod \"cinder-1934-account-create-update-d94f6\" (UID: \"71dfb945-64cc-4994-bd4e-591797fc6ca8\") " pod="openstack/cinder-1934-account-create-update-d94f6" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.989507 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfzbw\" (UniqueName: \"kubernetes.io/projected/a4c68a42-f073-4460-82f4-d5ddaaa26b05-kube-api-access-qfzbw\") pod \"cinder-db-create-dq884\" (UID: \"a4c68a42-f073-4460-82f4-d5ddaaa26b05\") " pod="openstack/cinder-db-create-dq884" Mar 
18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.989576 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp2nx\" (UniqueName: \"kubernetes.io/projected/71dfb945-64cc-4994-bd4e-591797fc6ca8-kube-api-access-fp2nx\") pod \"cinder-1934-account-create-update-d94f6\" (UID: \"71dfb945-64cc-4994-bd4e-591797fc6ca8\") " pod="openstack/cinder-1934-account-create-update-d94f6" Mar 18 14:21:16 crc kubenswrapper[4756]: I0318 14:21:16.997182 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d7ef-account-create-update-6xjvm"] Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.020157 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-6vpwc" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.038556 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-8583-account-create-update-4lbfs"] Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.039644 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-8583-account-create-update-4lbfs" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.069876 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.081579 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-8583-account-create-update-4lbfs"] Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.097182 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8tg2\" (UniqueName: \"kubernetes.io/projected/da559c8c-1db0-49af-9b45-22af3a40eccf-kube-api-access-g8tg2\") pod \"barbican-d7ef-account-create-update-6xjvm\" (UID: \"da559c8c-1db0-49af-9b45-22af3a40eccf\") " pod="openstack/barbican-d7ef-account-create-update-6xjvm" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.097255 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71dfb945-64cc-4994-bd4e-591797fc6ca8-operator-scripts\") pod \"cinder-1934-account-create-update-d94f6\" (UID: \"71dfb945-64cc-4994-bd4e-591797fc6ca8\") " pod="openstack/cinder-1934-account-create-update-d94f6" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.097281 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfzbw\" (UniqueName: \"kubernetes.io/projected/a4c68a42-f073-4460-82f4-d5ddaaa26b05-kube-api-access-qfzbw\") pod \"cinder-db-create-dq884\" (UID: \"a4c68a42-f073-4460-82f4-d5ddaaa26b05\") " pod="openstack/cinder-db-create-dq884" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.097310 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da559c8c-1db0-49af-9b45-22af3a40eccf-operator-scripts\") pod 
\"barbican-d7ef-account-create-update-6xjvm\" (UID: \"da559c8c-1db0-49af-9b45-22af3a40eccf\") " pod="openstack/barbican-d7ef-account-create-update-6xjvm" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.097370 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp2nx\" (UniqueName: \"kubernetes.io/projected/71dfb945-64cc-4994-bd4e-591797fc6ca8-kube-api-access-fp2nx\") pod \"cinder-1934-account-create-update-d94f6\" (UID: \"71dfb945-64cc-4994-bd4e-591797fc6ca8\") " pod="openstack/cinder-1934-account-create-update-d94f6" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.097424 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4c68a42-f073-4460-82f4-d5ddaaa26b05-operator-scripts\") pod \"cinder-db-create-dq884\" (UID: \"a4c68a42-f073-4460-82f4-d5ddaaa26b05\") " pod="openstack/cinder-db-create-dq884" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.103567 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71dfb945-64cc-4994-bd4e-591797fc6ca8-operator-scripts\") pod \"cinder-1934-account-create-update-d94f6\" (UID: \"71dfb945-64cc-4994-bd4e-591797fc6ca8\") " pod="openstack/cinder-1934-account-create-update-d94f6" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.125720 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4c68a42-f073-4460-82f4-d5ddaaa26b05-operator-scripts\") pod \"cinder-db-create-dq884\" (UID: \"a4c68a42-f073-4460-82f4-d5ddaaa26b05\") " pod="openstack/cinder-db-create-dq884" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.169065 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfzbw\" (UniqueName: 
\"kubernetes.io/projected/a4c68a42-f073-4460-82f4-d5ddaaa26b05-kube-api-access-qfzbw\") pod \"cinder-db-create-dq884\" (UID: \"a4c68a42-f073-4460-82f4-d5ddaaa26b05\") " pod="openstack/cinder-db-create-dq884" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.173543 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp2nx\" (UniqueName: \"kubernetes.io/projected/71dfb945-64cc-4994-bd4e-591797fc6ca8-kube-api-access-fp2nx\") pod \"cinder-1934-account-create-update-d94f6\" (UID: \"71dfb945-64cc-4994-bd4e-591797fc6ca8\") " pod="openstack/cinder-1934-account-create-update-d94f6" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.175220 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-xtr5q"] Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.176447 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xtr5q" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.178850 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1934-account-create-update-d94f6" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.183560 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jstgl" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.183704 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.183803 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.198886 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.199909 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/859f4c18-f94d-49bd-9287-ffad04cbe5d9-operator-scripts\") pod \"cloudkitty-8583-account-create-update-4lbfs\" (UID: \"859f4c18-f94d-49bd-9287-ffad04cbe5d9\") " pod="openstack/cloudkitty-8583-account-create-update-4lbfs" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.199943 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da559c8c-1db0-49af-9b45-22af3a40eccf-operator-scripts\") pod \"barbican-d7ef-account-create-update-6xjvm\" (UID: \"da559c8c-1db0-49af-9b45-22af3a40eccf\") " pod="openstack/barbican-d7ef-account-create-update-6xjvm" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.200058 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8tg2\" (UniqueName: \"kubernetes.io/projected/da559c8c-1db0-49af-9b45-22af3a40eccf-kube-api-access-g8tg2\") pod \"barbican-d7ef-account-create-update-6xjvm\" (UID: \"da559c8c-1db0-49af-9b45-22af3a40eccf\") " 
pod="openstack/barbican-d7ef-account-create-update-6xjvm" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.200093 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qtqb\" (UniqueName: \"kubernetes.io/projected/859f4c18-f94d-49bd-9287-ffad04cbe5d9-kube-api-access-6qtqb\") pod \"cloudkitty-8583-account-create-update-4lbfs\" (UID: \"859f4c18-f94d-49bd-9287-ffad04cbe5d9\") " pod="openstack/cloudkitty-8583-account-create-update-4lbfs" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.200789 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da559c8c-1db0-49af-9b45-22af3a40eccf-operator-scripts\") pod \"barbican-d7ef-account-create-update-6xjvm\" (UID: \"da559c8c-1db0-49af-9b45-22af3a40eccf\") " pod="openstack/barbican-d7ef-account-create-update-6xjvm" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.242646 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8tg2\" (UniqueName: \"kubernetes.io/projected/da559c8c-1db0-49af-9b45-22af3a40eccf-kube-api-access-g8tg2\") pod \"barbican-d7ef-account-create-update-6xjvm\" (UID: \"da559c8c-1db0-49af-9b45-22af3a40eccf\") " pod="openstack/barbican-d7ef-account-create-update-6xjvm" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.251403 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xtr5q"] Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.257690 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.257743 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.260336 4756 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-db-create-jrvxc"] Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.261411 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jrvxc" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.270341 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.283225 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-2e41-account-create-update-8hh7d"] Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.284460 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2e41-account-create-update-8hh7d" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.296947 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jrvxc"] Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.300368 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.301252 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d7ef-account-create-update-6xjvm" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.303921 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qtqb\" (UniqueName: \"kubernetes.io/projected/859f4c18-f94d-49bd-9287-ffad04cbe5d9-kube-api-access-6qtqb\") pod \"cloudkitty-8583-account-create-update-4lbfs\" (UID: \"859f4c18-f94d-49bd-9287-ffad04cbe5d9\") " pod="openstack/cloudkitty-8583-account-create-update-4lbfs" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.303971 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqj9b\" (UniqueName: \"kubernetes.io/projected/fcdc6f55-de64-4698-9c24-35d42eca014c-kube-api-access-lqj9b\") pod \"keystone-db-sync-xtr5q\" (UID: \"fcdc6f55-de64-4698-9c24-35d42eca014c\") " pod="openstack/keystone-db-sync-xtr5q" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.303999 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcdc6f55-de64-4698-9c24-35d42eca014c-config-data\") pod \"keystone-db-sync-xtr5q\" (UID: \"fcdc6f55-de64-4698-9c24-35d42eca014c\") " pod="openstack/keystone-db-sync-xtr5q" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.304018 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcdc6f55-de64-4698-9c24-35d42eca014c-combined-ca-bundle\") pod \"keystone-db-sync-xtr5q\" (UID: \"fcdc6f55-de64-4698-9c24-35d42eca014c\") " pod="openstack/keystone-db-sync-xtr5q" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.304063 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/859f4c18-f94d-49bd-9287-ffad04cbe5d9-operator-scripts\") pod 
\"cloudkitty-8583-account-create-update-4lbfs\" (UID: \"859f4c18-f94d-49bd-9287-ffad04cbe5d9\") " pod="openstack/cloudkitty-8583-account-create-update-4lbfs" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.307450 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/859f4c18-f94d-49bd-9287-ffad04cbe5d9-operator-scripts\") pod \"cloudkitty-8583-account-create-update-4lbfs\" (UID: \"859f4c18-f94d-49bd-9287-ffad04cbe5d9\") " pod="openstack/cloudkitty-8583-account-create-update-4lbfs" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.309560 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2e41-account-create-update-8hh7d"] Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.349490 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-f6cr8"] Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.350740 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-f6cr8" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.363685 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qtqb\" (UniqueName: \"kubernetes.io/projected/859f4c18-f94d-49bd-9287-ffad04cbe5d9-kube-api-access-6qtqb\") pod \"cloudkitty-8583-account-create-update-4lbfs\" (UID: \"859f4c18-f94d-49bd-9287-ffad04cbe5d9\") " pod="openstack/cloudkitty-8583-account-create-update-4lbfs" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.378234 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-f6cr8"] Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.410357 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c-operator-scripts\") pod \"neutron-2e41-account-create-update-8hh7d\" (UID: \"0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c\") " pod="openstack/neutron-2e41-account-create-update-8hh7d" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.410415 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q557\" (UniqueName: \"kubernetes.io/projected/622c7c3a-8ac1-4286-8bcf-b1444608c489-kube-api-access-2q557\") pod \"neutron-db-create-f6cr8\" (UID: \"622c7c3a-8ac1-4286-8bcf-b1444608c489\") " pod="openstack/neutron-db-create-f6cr8" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.410465 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlcf5\" (UniqueName: \"kubernetes.io/projected/0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c-kube-api-access-hlcf5\") pod \"neutron-2e41-account-create-update-8hh7d\" (UID: \"0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c\") " pod="openstack/neutron-2e41-account-create-update-8hh7d" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 
14:21:17.410513 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzdjh\" (UniqueName: \"kubernetes.io/projected/45537606-07d7-49ee-abed-7aab10e9deab-kube-api-access-xzdjh\") pod \"barbican-db-create-jrvxc\" (UID: \"45537606-07d7-49ee-abed-7aab10e9deab\") " pod="openstack/barbican-db-create-jrvxc" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.410619 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/622c7c3a-8ac1-4286-8bcf-b1444608c489-operator-scripts\") pod \"neutron-db-create-f6cr8\" (UID: \"622c7c3a-8ac1-4286-8bcf-b1444608c489\") " pod="openstack/neutron-db-create-f6cr8" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.410679 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45537606-07d7-49ee-abed-7aab10e9deab-operator-scripts\") pod \"barbican-db-create-jrvxc\" (UID: \"45537606-07d7-49ee-abed-7aab10e9deab\") " pod="openstack/barbican-db-create-jrvxc" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.410728 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqj9b\" (UniqueName: \"kubernetes.io/projected/fcdc6f55-de64-4698-9c24-35d42eca014c-kube-api-access-lqj9b\") pod \"keystone-db-sync-xtr5q\" (UID: \"fcdc6f55-de64-4698-9c24-35d42eca014c\") " pod="openstack/keystone-db-sync-xtr5q" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.410784 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcdc6f55-de64-4698-9c24-35d42eca014c-config-data\") pod \"keystone-db-sync-xtr5q\" (UID: \"fcdc6f55-de64-4698-9c24-35d42eca014c\") " pod="openstack/keystone-db-sync-xtr5q" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.410814 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcdc6f55-de64-4698-9c24-35d42eca014c-combined-ca-bundle\") pod \"keystone-db-sync-xtr5q\" (UID: \"fcdc6f55-de64-4698-9c24-35d42eca014c\") " pod="openstack/keystone-db-sync-xtr5q" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.425606 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcdc6f55-de64-4698-9c24-35d42eca014c-config-data\") pod \"keystone-db-sync-xtr5q\" (UID: \"fcdc6f55-de64-4698-9c24-35d42eca014c\") " pod="openstack/keystone-db-sync-xtr5q" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.428978 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcdc6f55-de64-4698-9c24-35d42eca014c-combined-ca-bundle\") pod \"keystone-db-sync-xtr5q\" (UID: \"fcdc6f55-de64-4698-9c24-35d42eca014c\") " pod="openstack/keystone-db-sync-xtr5q" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.440001 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-dq884" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.441550 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqj9b\" (UniqueName: \"kubernetes.io/projected/fcdc6f55-de64-4698-9c24-35d42eca014c-kube-api-access-lqj9b\") pod \"keystone-db-sync-xtr5q\" (UID: \"fcdc6f55-de64-4698-9c24-35d42eca014c\") " pod="openstack/keystone-db-sync-xtr5q" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.514827 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlcf5\" (UniqueName: \"kubernetes.io/projected/0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c-kube-api-access-hlcf5\") pod \"neutron-2e41-account-create-update-8hh7d\" (UID: \"0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c\") " pod="openstack/neutron-2e41-account-create-update-8hh7d" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.514876 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzdjh\" (UniqueName: \"kubernetes.io/projected/45537606-07d7-49ee-abed-7aab10e9deab-kube-api-access-xzdjh\") pod \"barbican-db-create-jrvxc\" (UID: \"45537606-07d7-49ee-abed-7aab10e9deab\") " pod="openstack/barbican-db-create-jrvxc" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.514972 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/622c7c3a-8ac1-4286-8bcf-b1444608c489-operator-scripts\") pod \"neutron-db-create-f6cr8\" (UID: \"622c7c3a-8ac1-4286-8bcf-b1444608c489\") " pod="openstack/neutron-db-create-f6cr8" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.515004 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45537606-07d7-49ee-abed-7aab10e9deab-operator-scripts\") pod \"barbican-db-create-jrvxc\" (UID: \"45537606-07d7-49ee-abed-7aab10e9deab\") " 
pod="openstack/barbican-db-create-jrvxc" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.515079 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c-operator-scripts\") pod \"neutron-2e41-account-create-update-8hh7d\" (UID: \"0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c\") " pod="openstack/neutron-2e41-account-create-update-8hh7d" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.515097 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q557\" (UniqueName: \"kubernetes.io/projected/622c7c3a-8ac1-4286-8bcf-b1444608c489-kube-api-access-2q557\") pod \"neutron-db-create-f6cr8\" (UID: \"622c7c3a-8ac1-4286-8bcf-b1444608c489\") " pod="openstack/neutron-db-create-f6cr8" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.516502 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/622c7c3a-8ac1-4286-8bcf-b1444608c489-operator-scripts\") pod \"neutron-db-create-f6cr8\" (UID: \"622c7c3a-8ac1-4286-8bcf-b1444608c489\") " pod="openstack/neutron-db-create-f6cr8" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.517182 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45537606-07d7-49ee-abed-7aab10e9deab-operator-scripts\") pod \"barbican-db-create-jrvxc\" (UID: \"45537606-07d7-49ee-abed-7aab10e9deab\") " pod="openstack/barbican-db-create-jrvxc" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.518086 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c-operator-scripts\") pod \"neutron-2e41-account-create-update-8hh7d\" (UID: \"0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c\") " 
pod="openstack/neutron-2e41-account-create-update-8hh7d" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.540657 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q557\" (UniqueName: \"kubernetes.io/projected/622c7c3a-8ac1-4286-8bcf-b1444608c489-kube-api-access-2q557\") pod \"neutron-db-create-f6cr8\" (UID: \"622c7c3a-8ac1-4286-8bcf-b1444608c489\") " pod="openstack/neutron-db-create-f6cr8" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.548407 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzdjh\" (UniqueName: \"kubernetes.io/projected/45537606-07d7-49ee-abed-7aab10e9deab-kube-api-access-xzdjh\") pod \"barbican-db-create-jrvxc\" (UID: \"45537606-07d7-49ee-abed-7aab10e9deab\") " pod="openstack/barbican-db-create-jrvxc" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.569026 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-8583-account-create-update-4lbfs" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.587209 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xtr5q" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.591621 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlcf5\" (UniqueName: \"kubernetes.io/projected/0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c-kube-api-access-hlcf5\") pod \"neutron-2e41-account-create-update-8hh7d\" (UID: \"0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c\") " pod="openstack/neutron-2e41-account-create-update-8hh7d" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.609574 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jrvxc" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.648635 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-2e41-account-create-update-8hh7d" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.736704 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-f6cr8" Mar 18 14:21:17 crc kubenswrapper[4756]: I0318 14:21:17.776942 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-6vpwc"] Mar 18 14:21:17 crc kubenswrapper[4756]: W0318 14:21:17.810303 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cf1d7f9_fba2_4ad5_a5c0_f92452e6ab95.slice/crio-c76e13c1a9c06291df5e9046693a7d52ac4363fb3dd5440b17793846bbc238b3 WatchSource:0}: Error finding container c76e13c1a9c06291df5e9046693a7d52ac4363fb3dd5440b17793846bbc238b3: Status 404 returned error can't find the container with id c76e13c1a9c06291df5e9046693a7d52ac4363fb3dd5440b17793846bbc238b3 Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.039467 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.043655 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d7ef-account-create-update-6xjvm"] Mar 18 14:21:18 crc kubenswrapper[4756]: W0318 14:21:18.052849 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda559c8c_1db0_49af_9b45_22af3a40eccf.slice/crio-9af0c9979db11aba85a335159ec9b149ec2079b9066e01c2e3789cbb255139af WatchSource:0}: Error finding container 9af0c9979db11aba85a335159ec9b149ec2079b9066e01c2e3789cbb255139af: Status 404 returned error can't find the container with id 9af0c9979db11aba85a335159ec9b149ec2079b9066e01c2e3789cbb255139af Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.132017 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-run-ovn\") pod \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.132082 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-log-ovn\") pod \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.132210 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12e57d08-4793-42ee-b2dc-a3d7b6828f23-scripts\") pod \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.132303 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-run\") pod \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.132351 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12e57d08-4793-42ee-b2dc-a3d7b6828f23-additional-scripts\") pod \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.132378 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbdzw\" (UniqueName: \"kubernetes.io/projected/12e57d08-4793-42ee-b2dc-a3d7b6828f23-kube-api-access-wbdzw\") pod \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\" (UID: \"12e57d08-4793-42ee-b2dc-a3d7b6828f23\") " Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.132597 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-run" (OuterVolumeSpecName: "var-run") pod "12e57d08-4793-42ee-b2dc-a3d7b6828f23" (UID: "12e57d08-4793-42ee-b2dc-a3d7b6828f23"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.132855 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "12e57d08-4793-42ee-b2dc-a3d7b6828f23" (UID: "12e57d08-4793-42ee-b2dc-a3d7b6828f23"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.132886 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "12e57d08-4793-42ee-b2dc-a3d7b6828f23" (UID: "12e57d08-4793-42ee-b2dc-a3d7b6828f23"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.133131 4756 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.133148 4756 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.133158 4756 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12e57d08-4793-42ee-b2dc-a3d7b6828f23-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.133482 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e57d08-4793-42ee-b2dc-a3d7b6828f23-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "12e57d08-4793-42ee-b2dc-a3d7b6828f23" (UID: "12e57d08-4793-42ee-b2dc-a3d7b6828f23"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.133536 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e57d08-4793-42ee-b2dc-a3d7b6828f23-scripts" (OuterVolumeSpecName: "scripts") pod "12e57d08-4793-42ee-b2dc-a3d7b6828f23" (UID: "12e57d08-4793-42ee-b2dc-a3d7b6828f23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.138366 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12e57d08-4793-42ee-b2dc-a3d7b6828f23-kube-api-access-wbdzw" (OuterVolumeSpecName: "kube-api-access-wbdzw") pod "12e57d08-4793-42ee-b2dc-a3d7b6828f23" (UID: "12e57d08-4793-42ee-b2dc-a3d7b6828f23"). InnerVolumeSpecName "kube-api-access-wbdzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.182576 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-6vpwc" event={"ID":"1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95","Type":"ContainerStarted","Data":"d075f3a42af1fadbd910d2c6f4615b217c87e1f13bc186fc591aec1a99858977"} Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.183057 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-6vpwc" event={"ID":"1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95","Type":"ContainerStarted","Data":"c76e13c1a9c06291df5e9046693a7d52ac4363fb3dd5440b17793846bbc238b3"} Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.183938 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d7ef-account-create-update-6xjvm" event={"ID":"da559c8c-1db0-49af-9b45-22af3a40eccf","Type":"ContainerStarted","Data":"9af0c9979db11aba85a335159ec9b149ec2079b9066e01c2e3789cbb255139af"} Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.186393 4756 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/ovn-controller-r6s8c-config-fxq89" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.186781 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r6s8c-config-fxq89" event={"ID":"12e57d08-4793-42ee-b2dc-a3d7b6828f23","Type":"ContainerDied","Data":"731fb619fedc30e949563fb4352f8a38f67ae6da11bc11452293eb4e49d1caca"} Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.186809 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="731fb619fedc30e949563fb4352f8a38f67ae6da11bc11452293eb4e49d1caca" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.196864 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.204923 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-r6s8c-config-fxq89"] Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.222680 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-r6s8c-config-fxq89"] Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.228735 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-create-6vpwc" podStartSLOduration=2.228716906 podStartE2EDuration="2.228716906s" podCreationTimestamp="2026-03-18 14:21:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:21:18.211643103 +0000 UTC m=+1279.526061078" watchObservedRunningTime="2026-03-18 14:21:18.228716906 +0000 UTC m=+1279.543134881" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.235982 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12e57d08-4793-42ee-b2dc-a3d7b6828f23-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:18 crc 
kubenswrapper[4756]: I0318 14:21:18.236011 4756 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12e57d08-4793-42ee-b2dc-a3d7b6828f23-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.236021 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbdzw\" (UniqueName: \"kubernetes.io/projected/12e57d08-4793-42ee-b2dc-a3d7b6828f23-kube-api-access-wbdzw\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.240909 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-dq884"] Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.300232 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1934-account-create-update-d94f6"] Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.601040 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jrvxc"] Mar 18 14:21:18 crc kubenswrapper[4756]: W0318 14:21:18.604250 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45537606_07d7_49ee_abed_7aab10e9deab.slice/crio-861a57c729c03892d97a06ac637a3022640edc642e7b9a0e1b396f35b759023d WatchSource:0}: Error finding container 861a57c729c03892d97a06ac637a3022640edc642e7b9a0e1b396f35b759023d: Status 404 returned error can't find the container with id 861a57c729c03892d97a06ac637a3022640edc642e7b9a0e1b396f35b759023d Mar 18 14:21:18 crc kubenswrapper[4756]: W0318 14:21:18.606619 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcdc6f55_de64_4698_9c24_35d42eca014c.slice/crio-117daefdda159ebab5642ab8e4956839d154110c9dbe1183490d497b177b3a96 WatchSource:0}: Error finding container 117daefdda159ebab5642ab8e4956839d154110c9dbe1183490d497b177b3a96: Status 404 
returned error can't find the container with id 117daefdda159ebab5642ab8e4956839d154110c9dbe1183490d497b177b3a96 Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.610043 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xtr5q"] Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.616626 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-8583-account-create-update-4lbfs"] Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.632202 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2e41-account-create-update-8hh7d"] Mar 18 14:21:18 crc kubenswrapper[4756]: W0318 14:21:18.649892 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f3af1c0_3ab7_43b6_bcd6_7de7fcbf6a0c.slice/crio-a81c56868936e41d8fda16f38e1b0f897fae25fe2e8de1896c0ff9858c5a0424 WatchSource:0}: Error finding container a81c56868936e41d8fda16f38e1b0f897fae25fe2e8de1896c0ff9858c5a0424: Status 404 returned error can't find the container with id a81c56868936e41d8fda16f38e1b0f897fae25fe2e8de1896c0ff9858c5a0424 Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.654794 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-f6cr8"] Mar 18 14:21:18 crc kubenswrapper[4756]: I0318 14:21:18.844938 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.219259 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-8583-account-create-update-4lbfs" event={"ID":"859f4c18-f94d-49bd-9287-ffad04cbe5d9","Type":"ContainerStarted","Data":"de0d5819c9180839002918cdaf9bfe6b093a031a737e6272c62c887f2f8a5717"} Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.219312 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-8583-account-create-update-4lbfs" event={"ID":"859f4c18-f94d-49bd-9287-ffad04cbe5d9","Type":"ContainerStarted","Data":"6ebbf6ca9a1a1a28d5df99c804d247eafc762620947a6d506f609668799a8589"} Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.222355 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-f6cr8" event={"ID":"622c7c3a-8ac1-4286-8bcf-b1444608c489","Type":"ContainerStarted","Data":"e21986a3f786c4c31e558ecbca97a99d022702e95cc250625d4368b344e0a9da"} Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.222387 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-f6cr8" event={"ID":"622c7c3a-8ac1-4286-8bcf-b1444608c489","Type":"ContainerStarted","Data":"1c8c379cb4b55cca319e16bf6bf0c0d70f42eba610bf26f7f71f2a8c7a0a957a"} Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.223839 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xtr5q" event={"ID":"fcdc6f55-de64-4698-9c24-35d42eca014c","Type":"ContainerStarted","Data":"117daefdda159ebab5642ab8e4956839d154110c9dbe1183490d497b177b3a96"} Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.226488 4756 generic.go:334] "Generic (PLEG): container finished" podID="da559c8c-1db0-49af-9b45-22af3a40eccf" containerID="7c915df2dee3f4663a41dcd9a86ab2b7f72e2623788c2983f916df4f1db627b8" exitCode=0 Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.226555 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d7ef-account-create-update-6xjvm" event={"ID":"da559c8c-1db0-49af-9b45-22af3a40eccf","Type":"ContainerDied","Data":"7c915df2dee3f4663a41dcd9a86ab2b7f72e2623788c2983f916df4f1db627b8"} Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.228343 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jrvxc" 
event={"ID":"45537606-07d7-49ee-abed-7aab10e9deab","Type":"ContainerStarted","Data":"861a57c729c03892d97a06ac637a3022640edc642e7b9a0e1b396f35b759023d"} Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.230376 4756 generic.go:334] "Generic (PLEG): container finished" podID="a4c68a42-f073-4460-82f4-d5ddaaa26b05" containerID="1cf4bcbe151782e076c6abb900c4dcdc8fcb725ff66d62798995bbf35698cbab" exitCode=0 Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.230436 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dq884" event={"ID":"a4c68a42-f073-4460-82f4-d5ddaaa26b05","Type":"ContainerDied","Data":"1cf4bcbe151782e076c6abb900c4dcdc8fcb725ff66d62798995bbf35698cbab"} Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.230458 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dq884" event={"ID":"a4c68a42-f073-4460-82f4-d5ddaaa26b05","Type":"ContainerStarted","Data":"4c6583ab2cf310186ebd2b3f8fa6b64d3c96f29a9b1a730379acf4a01d664dfd"} Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.235549 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-8583-account-create-update-4lbfs" podStartSLOduration=2.235531709 podStartE2EDuration="2.235531709s" podCreationTimestamp="2026-03-18 14:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:21:19.232389624 +0000 UTC m=+1280.546807599" watchObservedRunningTime="2026-03-18 14:21:19.235531709 +0000 UTC m=+1280.549949684" Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.239055 4756 generic.go:334] "Generic (PLEG): container finished" podID="1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95" containerID="d075f3a42af1fadbd910d2c6f4615b217c87e1f13bc186fc591aec1a99858977" exitCode=0 Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.239194 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-db-create-6vpwc" event={"ID":"1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95","Type":"ContainerDied","Data":"d075f3a42af1fadbd910d2c6f4615b217c87e1f13bc186fc591aec1a99858977"} Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.246313 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2e41-account-create-update-8hh7d" event={"ID":"0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c","Type":"ContainerStarted","Data":"bd9a06e07640582b0cd16a84f32400dfa07f0a04141d345279c0f4ef3b7064d5"} Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.246355 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2e41-account-create-update-8hh7d" event={"ID":"0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c","Type":"ContainerStarted","Data":"a81c56868936e41d8fda16f38e1b0f897fae25fe2e8de1896c0ff9858c5a0424"} Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.255420 4756 generic.go:334] "Generic (PLEG): container finished" podID="71dfb945-64cc-4994-bd4e-591797fc6ca8" containerID="674550c08e0f9236e4da0f39dfa646f5c61785b2943b5b7e53e7ca5f2728776c" exitCode=0 Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.256267 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1934-account-create-update-d94f6" event={"ID":"71dfb945-64cc-4994-bd4e-591797fc6ca8","Type":"ContainerDied","Data":"674550c08e0f9236e4da0f39dfa646f5c61785b2943b5b7e53e7ca5f2728776c"} Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.256310 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1934-account-create-update-d94f6" event={"ID":"71dfb945-64cc-4994-bd4e-591797fc6ca8","Type":"ContainerStarted","Data":"35d3ac4560e7cecaab6f9bf6d6c8b6d0fdd9e1748ad160e5a906c8ff488f4156"} Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.282852 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-f6cr8" podStartSLOduration=2.282825278 podStartE2EDuration="2.282825278s" 
podCreationTimestamp="2026-03-18 14:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:21:19.2496265 +0000 UTC m=+1280.564044485" watchObservedRunningTime="2026-03-18 14:21:19.282825278 +0000 UTC m=+1280.597243263" Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.321810 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-jrvxc" podStartSLOduration=2.321791101 podStartE2EDuration="2.321791101s" podCreationTimestamp="2026-03-18 14:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:21:19.303536897 +0000 UTC m=+1280.617954892" watchObservedRunningTime="2026-03-18 14:21:19.321791101 +0000 UTC m=+1280.636209076" Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.330096 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12e57d08-4793-42ee-b2dc-a3d7b6828f23" path="/var/lib/kubelet/pods/12e57d08-4793-42ee-b2dc-a3d7b6828f23/volumes" Mar 18 14:21:19 crc kubenswrapper[4756]: I0318 14:21:19.359468 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-2e41-account-create-update-8hh7d" podStartSLOduration=2.359447599 podStartE2EDuration="2.359447599s" podCreationTimestamp="2026-03-18 14:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:21:19.355484652 +0000 UTC m=+1280.669902657" watchObservedRunningTime="2026-03-18 14:21:19.359447599 +0000 UTC m=+1280.673865574" Mar 18 14:21:20 crc kubenswrapper[4756]: I0318 14:21:20.272872 4756 generic.go:334] "Generic (PLEG): container finished" podID="859f4c18-f94d-49bd-9287-ffad04cbe5d9" containerID="de0d5819c9180839002918cdaf9bfe6b093a031a737e6272c62c887f2f8a5717" exitCode=0 Mar 18 14:21:20 crc 
kubenswrapper[4756]: I0318 14:21:20.273247 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-8583-account-create-update-4lbfs" event={"ID":"859f4c18-f94d-49bd-9287-ffad04cbe5d9","Type":"ContainerDied","Data":"de0d5819c9180839002918cdaf9bfe6b093a031a737e6272c62c887f2f8a5717"} Mar 18 14:21:20 crc kubenswrapper[4756]: I0318 14:21:20.276455 4756 generic.go:334] "Generic (PLEG): container finished" podID="622c7c3a-8ac1-4286-8bcf-b1444608c489" containerID="e21986a3f786c4c31e558ecbca97a99d022702e95cc250625d4368b344e0a9da" exitCode=0 Mar 18 14:21:20 crc kubenswrapper[4756]: I0318 14:21:20.276524 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-f6cr8" event={"ID":"622c7c3a-8ac1-4286-8bcf-b1444608c489","Type":"ContainerDied","Data":"e21986a3f786c4c31e558ecbca97a99d022702e95cc250625d4368b344e0a9da"} Mar 18 14:21:20 crc kubenswrapper[4756]: I0318 14:21:20.278423 4756 generic.go:334] "Generic (PLEG): container finished" podID="0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c" containerID="bd9a06e07640582b0cd16a84f32400dfa07f0a04141d345279c0f4ef3b7064d5" exitCode=0 Mar 18 14:21:20 crc kubenswrapper[4756]: I0318 14:21:20.278482 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2e41-account-create-update-8hh7d" event={"ID":"0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c","Type":"ContainerDied","Data":"bd9a06e07640582b0cd16a84f32400dfa07f0a04141d345279c0f4ef3b7064d5"} Mar 18 14:21:20 crc kubenswrapper[4756]: I0318 14:21:20.280217 4756 generic.go:334] "Generic (PLEG): container finished" podID="45537606-07d7-49ee-abed-7aab10e9deab" containerID="e5c2a81656bfe4293055d4c913767388205276cef98d6c9857c9a3afdff264d0" exitCode=0 Mar 18 14:21:20 crc kubenswrapper[4756]: I0318 14:21:20.281349 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jrvxc" 
event={"ID":"45537606-07d7-49ee-abed-7aab10e9deab","Type":"ContainerDied","Data":"e5c2a81656bfe4293055d4c913767388205276cef98d6c9857c9a3afdff264d0"} Mar 18 14:21:20 crc kubenswrapper[4756]: I0318 14:21:20.683483 4756 scope.go:117] "RemoveContainer" containerID="799ebddf58392ceaef15766661e99efd77cb4050930d43e64dc63f69668d1500" Mar 18 14:21:21 crc kubenswrapper[4756]: I0318 14:21:21.765789 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:21 crc kubenswrapper[4756]: I0318 14:21:21.831441 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-tw5k4"] Mar 18 14:21:21 crc kubenswrapper[4756]: I0318 14:21:21.832662 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-tw5k4" podUID="5feba85a-c965-4c24-a732-39ccc0ddbcf1" containerName="dnsmasq-dns" containerID="cri-o://c8105ff4eb910d4758dd4fae4a3311f0bc76a9446cfdcfbdf17d461a45e4b10a" gracePeriod=10 Mar 18 14:21:22 crc kubenswrapper[4756]: I0318 14:21:22.302281 4756 generic.go:334] "Generic (PLEG): container finished" podID="5feba85a-c965-4c24-a732-39ccc0ddbcf1" containerID="c8105ff4eb910d4758dd4fae4a3311f0bc76a9446cfdcfbdf17d461a45e4b10a" exitCode=0 Mar 18 14:21:22 crc kubenswrapper[4756]: I0318 14:21:22.302336 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-tw5k4" event={"ID":"5feba85a-c965-4c24-a732-39ccc0ddbcf1","Type":"ContainerDied","Data":"c8105ff4eb910d4758dd4fae4a3311f0bc76a9446cfdcfbdf17d461a45e4b10a"} Mar 18 14:21:23 crc kubenswrapper[4756]: E0318 14:21:23.218459 4756 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.311839 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jrvxc" 
event={"ID":"45537606-07d7-49ee-abed-7aab10e9deab","Type":"ContainerDied","Data":"861a57c729c03892d97a06ac637a3022640edc642e7b9a0e1b396f35b759023d"} Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.311876 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="861a57c729c03892d97a06ac637a3022640edc642e7b9a0e1b396f35b759023d" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.313908 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1934-account-create-update-d94f6" event={"ID":"71dfb945-64cc-4994-bd4e-591797fc6ca8","Type":"ContainerDied","Data":"35d3ac4560e7cecaab6f9bf6d6c8b6d0fdd9e1748ad160e5a906c8ff488f4156"} Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.313951 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35d3ac4560e7cecaab6f9bf6d6c8b6d0fdd9e1748ad160e5a906c8ff488f4156" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.324532 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-8583-account-create-update-4lbfs" event={"ID":"859f4c18-f94d-49bd-9287-ffad04cbe5d9","Type":"ContainerDied","Data":"6ebbf6ca9a1a1a28d5df99c804d247eafc762620947a6d506f609668799a8589"} Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.324569 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ebbf6ca9a1a1a28d5df99c804d247eafc762620947a6d506f609668799a8589" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.324581 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d7ef-account-create-update-6xjvm" event={"ID":"da559c8c-1db0-49af-9b45-22af3a40eccf","Type":"ContainerDied","Data":"9af0c9979db11aba85a335159ec9b149ec2079b9066e01c2e3789cbb255139af"} Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.324590 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9af0c9979db11aba85a335159ec9b149ec2079b9066e01c2e3789cbb255139af" Mar 
18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.324600 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dq884" event={"ID":"a4c68a42-f073-4460-82f4-d5ddaaa26b05","Type":"ContainerDied","Data":"4c6583ab2cf310186ebd2b3f8fa6b64d3c96f29a9b1a730379acf4a01d664dfd"} Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.324608 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c6583ab2cf310186ebd2b3f8fa6b64d3c96f29a9b1a730379acf4a01d664dfd" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.324616 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-6vpwc" event={"ID":"1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95","Type":"ContainerDied","Data":"c76e13c1a9c06291df5e9046693a7d52ac4363fb3dd5440b17793846bbc238b3"} Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.324624 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c76e13c1a9c06291df5e9046693a7d52ac4363fb3dd5440b17793846bbc238b3" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.324633 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-f6cr8" event={"ID":"622c7c3a-8ac1-4286-8bcf-b1444608c489","Type":"ContainerDied","Data":"1c8c379cb4b55cca319e16bf6bf0c0d70f42eba610bf26f7f71f2a8c7a0a957a"} Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.324641 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c8c379cb4b55cca319e16bf6bf0c0d70f42eba610bf26f7f71f2a8c7a0a957a" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.324648 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2e41-account-create-update-8hh7d" event={"ID":"0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c","Type":"ContainerDied","Data":"a81c56868936e41d8fda16f38e1b0f897fae25fe2e8de1896c0ff9858c5a0424"} Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.324655 4756 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="a81c56868936e41d8fda16f38e1b0f897fae25fe2e8de1896c0ff9858c5a0424" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.373779 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jrvxc" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.396637 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dq884" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.405564 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-6vpwc" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.424838 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1934-account-create-update-d94f6" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.438037 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-8583-account-create-update-4lbfs" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.447926 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2e41-account-create-update-8hh7d" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.456922 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d7ef-account-create-update-6xjvm" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.463509 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71dfb945-64cc-4994-bd4e-591797fc6ca8-operator-scripts\") pod \"71dfb945-64cc-4994-bd4e-591797fc6ca8\" (UID: \"71dfb945-64cc-4994-bd4e-591797fc6ca8\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.463555 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfzbw\" (UniqueName: \"kubernetes.io/projected/a4c68a42-f073-4460-82f4-d5ddaaa26b05-kube-api-access-qfzbw\") pod \"a4c68a42-f073-4460-82f4-d5ddaaa26b05\" (UID: \"a4c68a42-f073-4460-82f4-d5ddaaa26b05\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.463626 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k27x\" (UniqueName: \"kubernetes.io/projected/1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95-kube-api-access-9k27x\") pod \"1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95\" (UID: \"1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.463756 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4c68a42-f073-4460-82f4-d5ddaaa26b05-operator-scripts\") pod \"a4c68a42-f073-4460-82f4-d5ddaaa26b05\" (UID: \"a4c68a42-f073-4460-82f4-d5ddaaa26b05\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.463808 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95-operator-scripts\") pod \"1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95\" (UID: \"1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.463900 4756 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45537606-07d7-49ee-abed-7aab10e9deab-operator-scripts\") pod \"45537606-07d7-49ee-abed-7aab10e9deab\" (UID: \"45537606-07d7-49ee-abed-7aab10e9deab\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.463941 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzdjh\" (UniqueName: \"kubernetes.io/projected/45537606-07d7-49ee-abed-7aab10e9deab-kube-api-access-xzdjh\") pod \"45537606-07d7-49ee-abed-7aab10e9deab\" (UID: \"45537606-07d7-49ee-abed-7aab10e9deab\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.463971 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp2nx\" (UniqueName: \"kubernetes.io/projected/71dfb945-64cc-4994-bd4e-591797fc6ca8-kube-api-access-fp2nx\") pod \"71dfb945-64cc-4994-bd4e-591797fc6ca8\" (UID: \"71dfb945-64cc-4994-bd4e-591797fc6ca8\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.464449 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95" (UID: "1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.464465 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c68a42-f073-4460-82f4-d5ddaaa26b05-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4c68a42-f073-4460-82f4-d5ddaaa26b05" (UID: "a4c68a42-f073-4460-82f4-d5ddaaa26b05"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.464500 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71dfb945-64cc-4994-bd4e-591797fc6ca8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71dfb945-64cc-4994-bd4e-591797fc6ca8" (UID: "71dfb945-64cc-4994-bd4e-591797fc6ca8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.466019 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45537606-07d7-49ee-abed-7aab10e9deab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45537606-07d7-49ee-abed-7aab10e9deab" (UID: "45537606-07d7-49ee-abed-7aab10e9deab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.471852 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c68a42-f073-4460-82f4-d5ddaaa26b05-kube-api-access-qfzbw" (OuterVolumeSpecName: "kube-api-access-qfzbw") pod "a4c68a42-f073-4460-82f4-d5ddaaa26b05" (UID: "a4c68a42-f073-4460-82f4-d5ddaaa26b05"). InnerVolumeSpecName "kube-api-access-qfzbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.473728 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-f6cr8" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.473738 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95-kube-api-access-9k27x" (OuterVolumeSpecName: "kube-api-access-9k27x") pod "1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95" (UID: "1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95"). 
InnerVolumeSpecName "kube-api-access-9k27x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.478087 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45537606-07d7-49ee-abed-7aab10e9deab-kube-api-access-xzdjh" (OuterVolumeSpecName: "kube-api-access-xzdjh") pod "45537606-07d7-49ee-abed-7aab10e9deab" (UID: "45537606-07d7-49ee-abed-7aab10e9deab"). InnerVolumeSpecName "kube-api-access-xzdjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.481859 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71dfb945-64cc-4994-bd4e-591797fc6ca8-kube-api-access-fp2nx" (OuterVolumeSpecName: "kube-api-access-fp2nx") pod "71dfb945-64cc-4994-bd4e-591797fc6ca8" (UID: "71dfb945-64cc-4994-bd4e-591797fc6ca8"). InnerVolumeSpecName "kube-api-access-fp2nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.483856 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.565951 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qtqb\" (UniqueName: \"kubernetes.io/projected/859f4c18-f94d-49bd-9287-ffad04cbe5d9-kube-api-access-6qtqb\") pod \"859f4c18-f94d-49bd-9287-ffad04cbe5d9\" (UID: \"859f4c18-f94d-49bd-9287-ffad04cbe5d9\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566020 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-config\") pod \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566075 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c-operator-scripts\") pod \"0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c\" (UID: \"0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566142 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlcf5\" (UniqueName: \"kubernetes.io/projected/0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c-kube-api-access-hlcf5\") pod \"0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c\" (UID: \"0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566189 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8tg2\" (UniqueName: \"kubernetes.io/projected/da559c8c-1db0-49af-9b45-22af3a40eccf-kube-api-access-g8tg2\") pod \"da559c8c-1db0-49af-9b45-22af3a40eccf\" (UID: \"da559c8c-1db0-49af-9b45-22af3a40eccf\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566219 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-ovsdbserver-sb\") pod \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566253 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-ovsdbserver-nb\") pod \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566278 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/859f4c18-f94d-49bd-9287-ffad04cbe5d9-operator-scripts\") pod \"859f4c18-f94d-49bd-9287-ffad04cbe5d9\" (UID: \"859f4c18-f94d-49bd-9287-ffad04cbe5d9\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566305 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da559c8c-1db0-49af-9b45-22af3a40eccf-operator-scripts\") pod \"da559c8c-1db0-49af-9b45-22af3a40eccf\" (UID: \"da559c8c-1db0-49af-9b45-22af3a40eccf\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566326 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q557\" (UniqueName: \"kubernetes.io/projected/622c7c3a-8ac1-4286-8bcf-b1444608c489-kube-api-access-2q557\") pod \"622c7c3a-8ac1-4286-8bcf-b1444608c489\" (UID: \"622c7c3a-8ac1-4286-8bcf-b1444608c489\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566373 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/622c7c3a-8ac1-4286-8bcf-b1444608c489-operator-scripts\") pod \"622c7c3a-8ac1-4286-8bcf-b1444608c489\" (UID: 
\"622c7c3a-8ac1-4286-8bcf-b1444608c489\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566396 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxppw\" (UniqueName: \"kubernetes.io/projected/5feba85a-c965-4c24-a732-39ccc0ddbcf1-kube-api-access-jxppw\") pod \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566420 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-dns-svc\") pod \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\" (UID: \"5feba85a-c965-4c24-a732-39ccc0ddbcf1\") " Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566676 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c" (UID: "0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566921 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566987 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4c68a42-f073-4460-82f4-d5ddaaa26b05-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566999 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.567010 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45537606-07d7-49ee-abed-7aab10e9deab-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.567022 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzdjh\" (UniqueName: \"kubernetes.io/projected/45537606-07d7-49ee-abed-7aab10e9deab-kube-api-access-xzdjh\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.567034 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp2nx\" (UniqueName: \"kubernetes.io/projected/71dfb945-64cc-4994-bd4e-591797fc6ca8-kube-api-access-fp2nx\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.567046 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71dfb945-64cc-4994-bd4e-591797fc6ca8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc 
kubenswrapper[4756]: I0318 14:21:23.567058 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfzbw\" (UniqueName: \"kubernetes.io/projected/a4c68a42-f073-4460-82f4-d5ddaaa26b05-kube-api-access-qfzbw\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.567069 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k27x\" (UniqueName: \"kubernetes.io/projected/1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95-kube-api-access-9k27x\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.566952 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/859f4c18-f94d-49bd-9287-ffad04cbe5d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "859f4c18-f94d-49bd-9287-ffad04cbe5d9" (UID: "859f4c18-f94d-49bd-9287-ffad04cbe5d9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.567231 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da559c8c-1db0-49af-9b45-22af3a40eccf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da559c8c-1db0-49af-9b45-22af3a40eccf" (UID: "da559c8c-1db0-49af-9b45-22af3a40eccf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.567912 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/622c7c3a-8ac1-4286-8bcf-b1444608c489-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "622c7c3a-8ac1-4286-8bcf-b1444608c489" (UID: "622c7c3a-8ac1-4286-8bcf-b1444608c489"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.569723 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/859f4c18-f94d-49bd-9287-ffad04cbe5d9-kube-api-access-6qtqb" (OuterVolumeSpecName: "kube-api-access-6qtqb") pod "859f4c18-f94d-49bd-9287-ffad04cbe5d9" (UID: "859f4c18-f94d-49bd-9287-ffad04cbe5d9"). InnerVolumeSpecName "kube-api-access-6qtqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.570447 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c-kube-api-access-hlcf5" (OuterVolumeSpecName: "kube-api-access-hlcf5") pod "0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c" (UID: "0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c"). InnerVolumeSpecName "kube-api-access-hlcf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.571018 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da559c8c-1db0-49af-9b45-22af3a40eccf-kube-api-access-g8tg2" (OuterVolumeSpecName: "kube-api-access-g8tg2") pod "da559c8c-1db0-49af-9b45-22af3a40eccf" (UID: "da559c8c-1db0-49af-9b45-22af3a40eccf"). InnerVolumeSpecName "kube-api-access-g8tg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.571064 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5feba85a-c965-4c24-a732-39ccc0ddbcf1-kube-api-access-jxppw" (OuterVolumeSpecName: "kube-api-access-jxppw") pod "5feba85a-c965-4c24-a732-39ccc0ddbcf1" (UID: "5feba85a-c965-4c24-a732-39ccc0ddbcf1"). InnerVolumeSpecName "kube-api-access-jxppw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.571644 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/622c7c3a-8ac1-4286-8bcf-b1444608c489-kube-api-access-2q557" (OuterVolumeSpecName: "kube-api-access-2q557") pod "622c7c3a-8ac1-4286-8bcf-b1444608c489" (UID: "622c7c3a-8ac1-4286-8bcf-b1444608c489"). InnerVolumeSpecName "kube-api-access-2q557". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.610703 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5feba85a-c965-4c24-a732-39ccc0ddbcf1" (UID: "5feba85a-c965-4c24-a732-39ccc0ddbcf1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.610749 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5feba85a-c965-4c24-a732-39ccc0ddbcf1" (UID: "5feba85a-c965-4c24-a732-39ccc0ddbcf1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.611308 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-config" (OuterVolumeSpecName: "config") pod "5feba85a-c965-4c24-a732-39ccc0ddbcf1" (UID: "5feba85a-c965-4c24-a732-39ccc0ddbcf1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.613469 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5feba85a-c965-4c24-a732-39ccc0ddbcf1" (UID: "5feba85a-c965-4c24-a732-39ccc0ddbcf1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.668605 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qtqb\" (UniqueName: \"kubernetes.io/projected/859f4c18-f94d-49bd-9287-ffad04cbe5d9-kube-api-access-6qtqb\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.668644 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.668653 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlcf5\" (UniqueName: \"kubernetes.io/projected/0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c-kube-api-access-hlcf5\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.668661 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8tg2\" (UniqueName: \"kubernetes.io/projected/da559c8c-1db0-49af-9b45-22af3a40eccf-kube-api-access-g8tg2\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.668671 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.668682 4756 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/859f4c18-f94d-49bd-9287-ffad04cbe5d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.668690 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.668699 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da559c8c-1db0-49af-9b45-22af3a40eccf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.668708 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q557\" (UniqueName: \"kubernetes.io/projected/622c7c3a-8ac1-4286-8bcf-b1444608c489-kube-api-access-2q557\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.668717 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxppw\" (UniqueName: \"kubernetes.io/projected/5feba85a-c965-4c24-a732-39ccc0ddbcf1-kube-api-access-jxppw\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.668727 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/622c7c3a-8ac1-4286-8bcf-b1444608c489-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:23 crc kubenswrapper[4756]: I0318 14:21:23.668737 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5feba85a-c965-4c24-a732-39ccc0ddbcf1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.331611 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xtr5q" 
event={"ID":"fcdc6f55-de64-4698-9c24-35d42eca014c","Type":"ContainerStarted","Data":"6fc9a7d1d02d193d67a567672382370fbce3d244cddd47ce909eae652a02e425"} Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.334336 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-tw5k4" event={"ID":"5feba85a-c965-4c24-a732-39ccc0ddbcf1","Type":"ContainerDied","Data":"244e8cc88a7ba0703e8e51c94dcabbacf3f6cb9febb6652c9b741ceecef71cd1"} Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.334357 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2e41-account-create-update-8hh7d" Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.334381 4756 scope.go:117] "RemoveContainer" containerID="c8105ff4eb910d4758dd4fae4a3311f0bc76a9446cfdcfbdf17d461a45e4b10a" Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.334376 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dq884" Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.334393 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-f6cr8" Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.334451 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1934-account-create-update-d94f6" Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.334486 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-6vpwc" Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.334495 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-tw5k4" Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.334504 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d7ef-account-create-update-6xjvm" Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.334501 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-8583-account-create-update-4lbfs" Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.334658 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jrvxc" Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.391344 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-xtr5q" podStartSLOduration=2.870099246 podStartE2EDuration="7.391324086s" podCreationTimestamp="2026-03-18 14:21:17 +0000 UTC" firstStartedPulling="2026-03-18 14:21:18.616208353 +0000 UTC m=+1279.930626328" lastFinishedPulling="2026-03-18 14:21:23.137433183 +0000 UTC m=+1284.451851168" observedRunningTime="2026-03-18 14:21:24.365725384 +0000 UTC m=+1285.680143399" watchObservedRunningTime="2026-03-18 14:21:24.391324086 +0000 UTC m=+1285.705742061" Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.393328 4756 scope.go:117] "RemoveContainer" containerID="eb33c1d4ad532d989ba7adbe90850f49596ea1c0cf12b6daf311e311b7a45432" Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.495590 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-tw5k4"] Mar 18 14:21:24 crc kubenswrapper[4756]: I0318 14:21:24.505925 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-tw5k4"] Mar 18 14:21:25 crc kubenswrapper[4756]: I0318 14:21:25.341868 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5feba85a-c965-4c24-a732-39ccc0ddbcf1" path="/var/lib/kubelet/pods/5feba85a-c965-4c24-a732-39ccc0ddbcf1/volumes" Mar 18 14:21:27 crc kubenswrapper[4756]: I0318 14:21:27.374724 4756 generic.go:334] "Generic (PLEG): container finished" 
podID="fcdc6f55-de64-4698-9c24-35d42eca014c" containerID="6fc9a7d1d02d193d67a567672382370fbce3d244cddd47ce909eae652a02e425" exitCode=0 Mar 18 14:21:27 crc kubenswrapper[4756]: I0318 14:21:27.374856 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xtr5q" event={"ID":"fcdc6f55-de64-4698-9c24-35d42eca014c","Type":"ContainerDied","Data":"6fc9a7d1d02d193d67a567672382370fbce3d244cddd47ce909eae652a02e425"} Mar 18 14:21:28 crc kubenswrapper[4756]: I0318 14:21:28.759696 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xtr5q" Mar 18 14:21:28 crc kubenswrapper[4756]: I0318 14:21:28.879420 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcdc6f55-de64-4698-9c24-35d42eca014c-config-data\") pod \"fcdc6f55-de64-4698-9c24-35d42eca014c\" (UID: \"fcdc6f55-de64-4698-9c24-35d42eca014c\") " Mar 18 14:21:28 crc kubenswrapper[4756]: I0318 14:21:28.879650 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqj9b\" (UniqueName: \"kubernetes.io/projected/fcdc6f55-de64-4698-9c24-35d42eca014c-kube-api-access-lqj9b\") pod \"fcdc6f55-de64-4698-9c24-35d42eca014c\" (UID: \"fcdc6f55-de64-4698-9c24-35d42eca014c\") " Mar 18 14:21:28 crc kubenswrapper[4756]: I0318 14:21:28.879704 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcdc6f55-de64-4698-9c24-35d42eca014c-combined-ca-bundle\") pod \"fcdc6f55-de64-4698-9c24-35d42eca014c\" (UID: \"fcdc6f55-de64-4698-9c24-35d42eca014c\") " Mar 18 14:21:28 crc kubenswrapper[4756]: I0318 14:21:28.885945 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcdc6f55-de64-4698-9c24-35d42eca014c-kube-api-access-lqj9b" (OuterVolumeSpecName: "kube-api-access-lqj9b") pod 
"fcdc6f55-de64-4698-9c24-35d42eca014c" (UID: "fcdc6f55-de64-4698-9c24-35d42eca014c"). InnerVolumeSpecName "kube-api-access-lqj9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:28 crc kubenswrapper[4756]: I0318 14:21:28.918749 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcdc6f55-de64-4698-9c24-35d42eca014c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcdc6f55-de64-4698-9c24-35d42eca014c" (UID: "fcdc6f55-de64-4698-9c24-35d42eca014c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:28 crc kubenswrapper[4756]: I0318 14:21:28.949154 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcdc6f55-de64-4698-9c24-35d42eca014c-config-data" (OuterVolumeSpecName: "config-data") pod "fcdc6f55-de64-4698-9c24-35d42eca014c" (UID: "fcdc6f55-de64-4698-9c24-35d42eca014c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:28 crc kubenswrapper[4756]: I0318 14:21:28.982087 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcdc6f55-de64-4698-9c24-35d42eca014c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:28 crc kubenswrapper[4756]: I0318 14:21:28.982159 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcdc6f55-de64-4698-9c24-35d42eca014c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:28 crc kubenswrapper[4756]: I0318 14:21:28.982180 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqj9b\" (UniqueName: \"kubernetes.io/projected/fcdc6f55-de64-4698-9c24-35d42eca014c-kube-api-access-lqj9b\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.400842 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xtr5q" event={"ID":"fcdc6f55-de64-4698-9c24-35d42eca014c","Type":"ContainerDied","Data":"117daefdda159ebab5642ab8e4956839d154110c9dbe1183490d497b177b3a96"} Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.401250 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="117daefdda159ebab5642ab8e4956839d154110c9dbe1183490d497b177b3a96" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.400905 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xtr5q" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.762964 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-9s9ss"] Mar 18 14:21:29 crc kubenswrapper[4756]: E0318 14:21:29.763473 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="622c7c3a-8ac1-4286-8bcf-b1444608c489" containerName="mariadb-database-create" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.763491 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="622c7c3a-8ac1-4286-8bcf-b1444608c489" containerName="mariadb-database-create" Mar 18 14:21:29 crc kubenswrapper[4756]: E0318 14:21:29.763522 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45537606-07d7-49ee-abed-7aab10e9deab" containerName="mariadb-database-create" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.763530 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="45537606-07d7-49ee-abed-7aab10e9deab" containerName="mariadb-database-create" Mar 18 14:21:29 crc kubenswrapper[4756]: E0318 14:21:29.763542 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e57d08-4793-42ee-b2dc-a3d7b6828f23" containerName="ovn-config" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.763550 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e57d08-4793-42ee-b2dc-a3d7b6828f23" containerName="ovn-config" Mar 18 14:21:29 crc kubenswrapper[4756]: E0318 14:21:29.763562 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcdc6f55-de64-4698-9c24-35d42eca014c" containerName="keystone-db-sync" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.763570 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcdc6f55-de64-4698-9c24-35d42eca014c" containerName="keystone-db-sync" Mar 18 14:21:29 crc kubenswrapper[4756]: E0318 14:21:29.763583 4756 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5feba85a-c965-4c24-a732-39ccc0ddbcf1" containerName="dnsmasq-dns" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.763590 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5feba85a-c965-4c24-a732-39ccc0ddbcf1" containerName="dnsmasq-dns" Mar 18 14:21:29 crc kubenswrapper[4756]: E0318 14:21:29.763604 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c68a42-f073-4460-82f4-d5ddaaa26b05" containerName="mariadb-database-create" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.763613 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c68a42-f073-4460-82f4-d5ddaaa26b05" containerName="mariadb-database-create" Mar 18 14:21:29 crc kubenswrapper[4756]: E0318 14:21:29.763623 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da559c8c-1db0-49af-9b45-22af3a40eccf" containerName="mariadb-account-create-update" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.763630 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="da559c8c-1db0-49af-9b45-22af3a40eccf" containerName="mariadb-account-create-update" Mar 18 14:21:29 crc kubenswrapper[4756]: E0318 14:21:29.763651 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71dfb945-64cc-4994-bd4e-591797fc6ca8" containerName="mariadb-account-create-update" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.763659 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="71dfb945-64cc-4994-bd4e-591797fc6ca8" containerName="mariadb-account-create-update" Mar 18 14:21:29 crc kubenswrapper[4756]: E0318 14:21:29.763671 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95" containerName="mariadb-database-create" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.763679 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95" containerName="mariadb-database-create" Mar 18 14:21:29 crc kubenswrapper[4756]: E0318 
14:21:29.763698 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="859f4c18-f94d-49bd-9287-ffad04cbe5d9" containerName="mariadb-account-create-update" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.763706 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="859f4c18-f94d-49bd-9287-ffad04cbe5d9" containerName="mariadb-account-create-update" Mar 18 14:21:29 crc kubenswrapper[4756]: E0318 14:21:29.763726 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c" containerName="mariadb-account-create-update" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.763735 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c" containerName="mariadb-account-create-update" Mar 18 14:21:29 crc kubenswrapper[4756]: E0318 14:21:29.763746 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5feba85a-c965-4c24-a732-39ccc0ddbcf1" containerName="init" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.763753 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5feba85a-c965-4c24-a732-39ccc0ddbcf1" containerName="init" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.763977 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="da559c8c-1db0-49af-9b45-22af3a40eccf" containerName="mariadb-account-create-update" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.763996 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="12e57d08-4793-42ee-b2dc-a3d7b6828f23" containerName="ovn-config" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.764010 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="71dfb945-64cc-4994-bd4e-591797fc6ca8" containerName="mariadb-account-create-update" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.764022 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5feba85a-c965-4c24-a732-39ccc0ddbcf1" 
containerName="dnsmasq-dns" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.764033 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="45537606-07d7-49ee-abed-7aab10e9deab" containerName="mariadb-database-create" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.764042 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcdc6f55-de64-4698-9c24-35d42eca014c" containerName="keystone-db-sync" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.764053 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="622c7c3a-8ac1-4286-8bcf-b1444608c489" containerName="mariadb-database-create" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.764070 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95" containerName="mariadb-database-create" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.764080 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="859f4c18-f94d-49bd-9287-ffad04cbe5d9" containerName="mariadb-account-create-update" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.764089 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c68a42-f073-4460-82f4-d5ddaaa26b05" containerName="mariadb-database-create" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.764106 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c" containerName="mariadb-account-create-update" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.765426 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.771390 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vtxsp"] Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.772522 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.775576 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.775845 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jstgl" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.775953 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.776063 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.776214 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.795227 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-9s9ss"] Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.808221 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vtxsp"] Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.899033 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-combined-ca-bundle\") pod \"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.899107 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-dns-svc\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " 
pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.899159 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.899189 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.899213 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzbqr\" (UniqueName: \"kubernetes.io/projected/1726f323-2352-403b-91cd-87f37d02bbd6-kube-api-access-rzbqr\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.899255 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-config\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.899276 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-config-data\") pod \"keystone-bootstrap-vtxsp\" (UID: 
\"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.899297 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-scripts\") pod \"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.899314 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.899348 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjgwr\" (UniqueName: \"kubernetes.io/projected/cca843b7-8858-4033-8d07-5de75be06ce4-kube-api-access-qjgwr\") pod \"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.899382 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-credential-keys\") pod \"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.899404 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-fernet-keys\") pod \"keystone-bootstrap-vtxsp\" (UID: 
\"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.936707 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-rpx8m"] Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.937792 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.943900 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.944085 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ws4s4" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.944203 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 14:21:29 crc kubenswrapper[4756]: I0318 14:21:29.976724 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rpx8m"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.006827 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-dns-svc\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.006902 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.006933 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.006980 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-db-sync-config-data\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.007008 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzbqr\" (UniqueName: \"kubernetes.io/projected/1726f323-2352-403b-91cd-87f37d02bbd6-kube-api-access-rzbqr\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.007039 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-config-data\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.007075 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-combined-ca-bundle\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.007106 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-config\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.007146 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-config-data\") pod \"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.007172 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-etc-machine-id\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.007196 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-scripts\") pod \"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.007221 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.007243 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjgwr\" (UniqueName: \"kubernetes.io/projected/cca843b7-8858-4033-8d07-5de75be06ce4-kube-api-access-qjgwr\") pod 
\"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.007291 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-credential-keys\") pod \"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.007325 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-fernet-keys\") pod \"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.007368 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-scripts\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.007412 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26xbh\" (UniqueName: \"kubernetes.io/projected/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-kube-api-access-26xbh\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.007431 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-combined-ca-bundle\") pod \"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " 
pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.012763 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-dns-svc\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.013334 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.013859 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.015352 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.016078 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-config\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.026462 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-combined-ca-bundle\") pod \"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.039545 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-config-data\") pod \"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.048793 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-credential-keys\") pod \"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.050915 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzbqr\" (UniqueName: \"kubernetes.io/projected/1726f323-2352-403b-91cd-87f37d02bbd6-kube-api-access-rzbqr\") pod \"dnsmasq-dns-847c4cc679-9s9ss\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.056868 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjgwr\" (UniqueName: \"kubernetes.io/projected/cca843b7-8858-4033-8d07-5de75be06ce4-kube-api-access-qjgwr\") pod \"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.057239 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-scripts\") pod \"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.063206 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-fernet-keys\") pod \"keystone-bootstrap-vtxsp\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.099519 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.110286 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26xbh\" (UniqueName: \"kubernetes.io/projected/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-kube-api-access-26xbh\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.110360 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-db-sync-config-data\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.110391 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-config-data\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.110417 4756 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-combined-ca-bundle\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.110446 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-etc-machine-id\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.110502 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-scripts\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.117822 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-etc-machine-id\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.120193 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.130032 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-ghbmc"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.131237 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-scripts\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.132500 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-combined-ca-bundle\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.132821 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ghbmc" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.132819 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-db-sync-config-data\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.142555 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xdszx" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.143029 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.143339 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-config-data\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.159729 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26xbh\" (UniqueName: \"kubernetes.io/projected/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-kube-api-access-26xbh\") pod \"cinder-db-sync-rpx8m\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.170585 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ghbmc"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.192211 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-z7pth"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.193421 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-z7pth" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.202982 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.203420 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.203631 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4vv6g" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.205433 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.224591 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-tpt8q"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.225893 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.231404 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.231423 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-d7k67" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.231790 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.233298 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5d52c352-cfd7-4679-912b-11f753c7831f-db-sync-config-data\") pod \"barbican-db-sync-ghbmc\" (UID: \"5d52c352-cfd7-4679-912b-11f753c7831f\") " pod="openstack/barbican-db-sync-ghbmc" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.233455 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-combined-ca-bundle\") pod \"neutron-db-sync-z7pth\" (UID: \"4bb78a1f-b53a-4b17-8b91-a70da1bb9071\") " pod="openstack/neutron-db-sync-z7pth" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.233561 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d74jm\" (UniqueName: \"kubernetes.io/projected/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-kube-api-access-d74jm\") pod \"neutron-db-sync-z7pth\" (UID: \"4bb78a1f-b53a-4b17-8b91-a70da1bb9071\") " pod="openstack/neutron-db-sync-z7pth" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.233690 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-config\") pod \"neutron-db-sync-z7pth\" (UID: \"4bb78a1f-b53a-4b17-8b91-a70da1bb9071\") " pod="openstack/neutron-db-sync-z7pth" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.233712 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d52c352-cfd7-4679-912b-11f753c7831f-combined-ca-bundle\") pod \"barbican-db-sync-ghbmc\" (UID: \"5d52c352-cfd7-4679-912b-11f753c7831f\") " pod="openstack/barbican-db-sync-ghbmc" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.233733 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bgxj\" (UniqueName: \"kubernetes.io/projected/5d52c352-cfd7-4679-912b-11f753c7831f-kube-api-access-8bgxj\") pod \"barbican-db-sync-ghbmc\" (UID: \"5d52c352-cfd7-4679-912b-11f753c7831f\") " pod="openstack/barbican-db-sync-ghbmc" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.244696 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.263647 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-tpt8q"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.285356 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-z7pth"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.333994 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-9s9ss"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.335023 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-scripts\") pod \"cloudkitty-db-sync-tpt8q\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " 
pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.335176 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5d52c352-cfd7-4679-912b-11f753c7831f-db-sync-config-data\") pod \"barbican-db-sync-ghbmc\" (UID: \"5d52c352-cfd7-4679-912b-11f753c7831f\") " pod="openstack/barbican-db-sync-ghbmc" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.335270 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8tcf\" (UniqueName: \"kubernetes.io/projected/7401eed2-7f0c-4f80-932b-5bc2df6684f8-kube-api-access-l8tcf\") pod \"cloudkitty-db-sync-tpt8q\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.335386 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-combined-ca-bundle\") pod \"neutron-db-sync-z7pth\" (UID: \"4bb78a1f-b53a-4b17-8b91-a70da1bb9071\") " pod="openstack/neutron-db-sync-z7pth" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.335452 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-config-data\") pod \"cloudkitty-db-sync-tpt8q\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.335547 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d74jm\" (UniqueName: \"kubernetes.io/projected/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-kube-api-access-d74jm\") pod \"neutron-db-sync-z7pth\" (UID: \"4bb78a1f-b53a-4b17-8b91-a70da1bb9071\") " pod="openstack/neutron-db-sync-z7pth" 
Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.335623 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7401eed2-7f0c-4f80-932b-5bc2df6684f8-certs\") pod \"cloudkitty-db-sync-tpt8q\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.335717 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-combined-ca-bundle\") pod \"cloudkitty-db-sync-tpt8q\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.335793 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-config\") pod \"neutron-db-sync-z7pth\" (UID: \"4bb78a1f-b53a-4b17-8b91-a70da1bb9071\") " pod="openstack/neutron-db-sync-z7pth" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.335859 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d52c352-cfd7-4679-912b-11f753c7831f-combined-ca-bundle\") pod \"barbican-db-sync-ghbmc\" (UID: \"5d52c352-cfd7-4679-912b-11f753c7831f\") " pod="openstack/barbican-db-sync-ghbmc" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.335923 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bgxj\" (UniqueName: \"kubernetes.io/projected/5d52c352-cfd7-4679-912b-11f753c7831f-kube-api-access-8bgxj\") pod \"barbican-db-sync-ghbmc\" (UID: \"5d52c352-cfd7-4679-912b-11f753c7831f\") " pod="openstack/barbican-db-sync-ghbmc" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.339234 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5d52c352-cfd7-4679-912b-11f753c7831f-db-sync-config-data\") pod \"barbican-db-sync-ghbmc\" (UID: \"5d52c352-cfd7-4679-912b-11f753c7831f\") " pod="openstack/barbican-db-sync-ghbmc" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.342088 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-combined-ca-bundle\") pod \"neutron-db-sync-z7pth\" (UID: \"4bb78a1f-b53a-4b17-8b91-a70da1bb9071\") " pod="openstack/neutron-db-sync-z7pth" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.346543 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d52c352-cfd7-4679-912b-11f753c7831f-combined-ca-bundle\") pod \"barbican-db-sync-ghbmc\" (UID: \"5d52c352-cfd7-4679-912b-11f753c7831f\") " pod="openstack/barbican-db-sync-ghbmc" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.351610 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-lvdn6"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.351915 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-config\") pod \"neutron-db-sync-z7pth\" (UID: \"4bb78a1f-b53a-4b17-8b91-a70da1bb9071\") " pod="openstack/neutron-db-sync-z7pth" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.354507 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.356977 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d74jm\" (UniqueName: \"kubernetes.io/projected/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-kube-api-access-d74jm\") pod \"neutron-db-sync-z7pth\" (UID: \"4bb78a1f-b53a-4b17-8b91-a70da1bb9071\") " pod="openstack/neutron-db-sync-z7pth" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.363333 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-lvdn6"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.374275 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.374354 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-cm9sh" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.374464 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.375653 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.378422 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bgxj\" (UniqueName: \"kubernetes.io/projected/5d52c352-cfd7-4679-912b-11f753c7831f-kube-api-access-8bgxj\") pod \"barbican-db-sync-ghbmc\" (UID: \"5d52c352-cfd7-4679-912b-11f753c7831f\") " pod="openstack/barbican-db-sync-ghbmc" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.378873 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.381529 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.386635 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.405719 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.417963 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-t4jg8"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.421635 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.437047 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a791a6a-0e50-465a-90cf-e4af5bdc12de-log-httpd\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.437074 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql966\" (UniqueName: \"kubernetes.io/projected/4a791a6a-0e50-465a-90cf-e4af5bdc12de-kube-api-access-ql966\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.439280 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8tcf\" (UniqueName: \"kubernetes.io/projected/7401eed2-7f0c-4f80-932b-5bc2df6684f8-kube-api-access-l8tcf\") pod \"cloudkitty-db-sync-tpt8q\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " 
pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.439381 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.439416 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-config-data\") pod \"cloudkitty-db-sync-tpt8q\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.439471 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjtth\" (UniqueName: \"kubernetes.io/projected/18b4eda4-6808-4165-8b7c-e24dc046467c-kube-api-access-jjtth\") pod \"placement-db-sync-lvdn6\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") " pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.439502 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-config-data\") pod \"placement-db-sync-lvdn6\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") " pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.439548 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18b4eda4-6808-4165-8b7c-e24dc046467c-logs\") pod \"placement-db-sync-lvdn6\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") " pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc 
kubenswrapper[4756]: I0318 14:21:30.439573 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7401eed2-7f0c-4f80-932b-5bc2df6684f8-certs\") pod \"cloudkitty-db-sync-tpt8q\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.439600 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.439641 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-combined-ca-bundle\") pod \"cloudkitty-db-sync-tpt8q\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.439689 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a791a6a-0e50-465a-90cf-e4af5bdc12de-run-httpd\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.439725 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-combined-ca-bundle\") pod \"placement-db-sync-lvdn6\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") " pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.439767 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-scripts\") pod \"cloudkitty-db-sync-tpt8q\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.439781 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-config-data\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.439806 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-scripts\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.439836 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-scripts\") pod \"placement-db-sync-lvdn6\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") " pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.449870 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-t4jg8"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.451414 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-combined-ca-bundle\") pod \"cloudkitty-db-sync-tpt8q\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.460649 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7401eed2-7f0c-4f80-932b-5bc2df6684f8-certs\") pod \"cloudkitty-db-sync-tpt8q\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.462661 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8tcf\" (UniqueName: \"kubernetes.io/projected/7401eed2-7f0c-4f80-932b-5bc2df6684f8-kube-api-access-l8tcf\") pod \"cloudkitty-db-sync-tpt8q\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.473877 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-scripts\") pod \"cloudkitty-db-sync-tpt8q\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.476397 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-config-data\") pod \"cloudkitty-db-sync-tpt8q\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541007 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-scripts\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541057 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-scripts\") pod 
\"placement-db-sync-lvdn6\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") " pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541091 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-config\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541130 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a791a6a-0e50-465a-90cf-e4af5bdc12de-log-httpd\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541146 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql966\" (UniqueName: \"kubernetes.io/projected/4a791a6a-0e50-465a-90cf-e4af5bdc12de-kube-api-access-ql966\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541179 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541207 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 
18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541230 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89rqv\" (UniqueName: \"kubernetes.io/projected/2279186d-a6bf-4a99-a62f-6f1a0a405269-kube-api-access-89rqv\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541266 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjtth\" (UniqueName: \"kubernetes.io/projected/18b4eda4-6808-4165-8b7c-e24dc046467c-kube-api-access-jjtth\") pod \"placement-db-sync-lvdn6\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") " pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541289 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-config-data\") pod \"placement-db-sync-lvdn6\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") " pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541313 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18b4eda4-6808-4165-8b7c-e24dc046467c-logs\") pod \"placement-db-sync-lvdn6\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") " pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541340 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541454 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541497 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a791a6a-0e50-465a-90cf-e4af5bdc12de-run-httpd\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541514 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541532 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541552 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-combined-ca-bundle\") pod \"placement-db-sync-lvdn6\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") " pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.541578 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-config-data\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.543268 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a791a6a-0e50-465a-90cf-e4af5bdc12de-run-httpd\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.545003 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18b4eda4-6808-4165-8b7c-e24dc046467c-logs\") pod \"placement-db-sync-lvdn6\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") " pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.545287 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a791a6a-0e50-465a-90cf-e4af5bdc12de-log-httpd\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.545802 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-scripts\") pod \"placement-db-sync-lvdn6\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") " pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.548710 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-config-data\") pod \"placement-db-sync-lvdn6\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") " pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: 
I0318 14:21:30.554258 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-combined-ca-bundle\") pod \"placement-db-sync-lvdn6\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") " pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.554640 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-scripts\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.555262 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-config-data\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.558495 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.558900 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.564797 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjtth\" (UniqueName: \"kubernetes.io/projected/18b4eda4-6808-4165-8b7c-e24dc046467c-kube-api-access-jjtth\") pod 
\"placement-db-sync-lvdn6\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") " pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.570307 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql966\" (UniqueName: \"kubernetes.io/projected/4a791a6a-0e50-465a-90cf-e4af5bdc12de-kube-api-access-ql966\") pod \"ceilometer-0\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.579605 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ghbmc" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.613132 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z7pth" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.634853 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.643146 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.643221 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.643244 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.643332 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-config\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.643473 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.644280 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89rqv\" (UniqueName: \"kubernetes.io/projected/2279186d-a6bf-4a99-a62f-6f1a0a405269-kube-api-access-89rqv\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.644439 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.645093 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-ovsdbserver-nb\") pod 
\"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.645339 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-config\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.650928 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.661683 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.683709 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89rqv\" (UniqueName: \"kubernetes.io/projected/2279186d-a6bf-4a99-a62f-6f1a0a405269-kube-api-access-89rqv\") pod \"dnsmasq-dns-785d8bcb8c-t4jg8\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.693772 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-lvdn6" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.705670 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.774452 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.900249 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-9s9ss"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.916712 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.918160 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.922102 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.922207 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.922435 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.922543 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rwf89" Mar 18 14:21:30 crc kubenswrapper[4756]: W0318 14:21:30.924738 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1726f323_2352_403b_91cd_87f37d02bbd6.slice/crio-f4f4b3ce132c8491b398a5e98fa6a4053fdea7b4d1b6ca43697d65f58c1a8b19 WatchSource:0}: Error finding container f4f4b3ce132c8491b398a5e98fa6a4053fdea7b4d1b6ca43697d65f58c1a8b19: Status 404 returned error can't find the container with id f4f4b3ce132c8491b398a5e98fa6a4053fdea7b4d1b6ca43697d65f58c1a8b19 Mar 18 14:21:30 crc 
kubenswrapper[4756]: I0318 14:21:30.931917 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.975705 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.977554 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.985873 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.986417 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 14:21:30 crc kubenswrapper[4756]: I0318 14:21:30.987156 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.068766 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.068877 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-scripts\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.068938 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.069004 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-config-data\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.069060 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/420186eb-4b69-4d45-a7c2-1d65009f2788-logs\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.069096 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-config-data\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.069173 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq7dl\" (UniqueName: \"kubernetes.io/projected/147bdf88-f75b-400a-9150-794ad2ebdb6f-kube-api-access-qq7dl\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.069296 
4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/420186eb-4b69-4d45-a7c2-1d65009f2788-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.069331 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-scripts\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.069363 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fd8f\" (UniqueName: \"kubernetes.io/projected/420186eb-4b69-4d45-a7c2-1d65009f2788-kube-api-access-2fd8f\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.069406 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/147bdf88-f75b-400a-9150-794ad2ebdb6f-logs\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.069430 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: 
I0318 14:21:31.069662 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.069780 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/147bdf88-f75b-400a-9150-794ad2ebdb6f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.069994 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.070071 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.100533 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vtxsp"] Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.114212 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rpx8m"] Mar 18 14:21:31 crc kubenswrapper[4756]: 
I0318 14:21:31.174565 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/420186eb-4b69-4d45-a7c2-1d65009f2788-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.174751 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-scripts\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.174841 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fd8f\" (UniqueName: \"kubernetes.io/projected/420186eb-4b69-4d45-a7c2-1d65009f2788-kube-api-access-2fd8f\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.174940 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/147bdf88-f75b-400a-9150-794ad2ebdb6f-logs\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.175087 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.175207 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.175287 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/147bdf88-f75b-400a-9150-794ad2ebdb6f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.175410 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.175502 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.175640 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 
14:21:31.175706 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-scripts\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.175831 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.175916 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-config-data\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.175995 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/420186eb-4b69-4d45-a7c2-1d65009f2788-logs\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.176081 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-config-data\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.176193 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-qq7dl\" (UniqueName: \"kubernetes.io/projected/147bdf88-f75b-400a-9150-794ad2ebdb6f-kube-api-access-qq7dl\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.176105 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/147bdf88-f75b-400a-9150-794ad2ebdb6f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.176486 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/420186eb-4b69-4d45-a7c2-1d65009f2788-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.180776 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.176088 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/147bdf88-f75b-400a-9150-794ad2ebdb6f-logs\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.181396 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/420186eb-4b69-4d45-a7c2-1d65009f2788-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.181595 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-scripts\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.184731 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-config-data\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.187005 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.188377 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.196138 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.206297 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2d16b602ee6d280a584346f693cabe12df6f04b4d4e7d81a050a495635db21be/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.202369 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.199497 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.206456 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/14728f060a3b5046b333048907b438ee0376fa68800afec942964a27fea1d4a8/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.203010 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-config-data\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.235301 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-scripts\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.247522 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq7dl\" (UniqueName: \"kubernetes.io/projected/147bdf88-f75b-400a-9150-794ad2ebdb6f-kube-api-access-qq7dl\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.248232 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fd8f\" (UniqueName: 
\"kubernetes.io/projected/420186eb-4b69-4d45-a7c2-1d65009f2788-kube-api-access-2fd8f\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.255252 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ghbmc"] Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.278807 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") pod \"glance-default-internal-api-0\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.299296 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") pod \"glance-default-external-api-0\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.419386 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-z7pth"] Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.451349 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-tpt8q"] Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.453743 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rpx8m" event={"ID":"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e","Type":"ContainerStarted","Data":"de2d616707bd338f3f128c3e5089c8f320f9fc76b5c4e75d6bde34833c945c41"} Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.455162 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-sync-ghbmc" event={"ID":"5d52c352-cfd7-4679-912b-11f753c7831f","Type":"ContainerStarted","Data":"eafbcd01aab50e5cb9c6faafbdea009f25c3a6455cec345adf6e1ceb52b7fffb"} Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.458231 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" event={"ID":"1726f323-2352-403b-91cd-87f37d02bbd6","Type":"ContainerStarted","Data":"f4f4b3ce132c8491b398a5e98fa6a4053fdea7b4d1b6ca43697d65f58c1a8b19"} Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.459137 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vtxsp" event={"ID":"cca843b7-8858-4033-8d07-5de75be06ce4","Type":"ContainerStarted","Data":"3d2f533ab7ba2c9a4cac9f891377d6b3c286c21ec9555ae3b514f1e5e564eae0"} Mar 18 14:21:31 crc kubenswrapper[4756]: W0318 14:21:31.464762 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bb78a1f_b53a_4b17_8b91_a70da1bb9071.slice/crio-58fd95c30e59944d6780108b9a99f78ddc43c3e77d7746525d269764e53ac6bb WatchSource:0}: Error finding container 58fd95c30e59944d6780108b9a99f78ddc43c3e77d7746525d269764e53ac6bb: Status 404 returned error can't find the container with id 58fd95c30e59944d6780108b9a99f78ddc43c3e77d7746525d269764e53ac6bb Mar 18 14:21:31 crc kubenswrapper[4756]: W0318 14:21:31.465225 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7401eed2_7f0c_4f80_932b_5bc2df6684f8.slice/crio-613d75d846910596582c301beac7223f31a73f201fe536624a7a35b859389892 WatchSource:0}: Error finding container 613d75d846910596582c301beac7223f31a73f201fe536624a7a35b859389892: Status 404 returned error can't find the container with id 613d75d846910596582c301beac7223f31a73f201fe536624a7a35b859389892 Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.494606 4756 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.554897 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.609069 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-lvdn6"] Mar 18 14:21:31 crc kubenswrapper[4756]: W0318 14:21:31.612248 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18b4eda4_6808_4165_8b7c_e24dc046467c.slice/crio-36ebabe9a17cc18caafe7a68d265478cc861065857e7d788137afa03c454132d WatchSource:0}: Error finding container 36ebabe9a17cc18caafe7a68d265478cc861065857e7d788137afa03c454132d: Status 404 returned error can't find the container with id 36ebabe9a17cc18caafe7a68d265478cc861065857e7d788137afa03c454132d Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.679268 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-t4jg8"] Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.698676 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:21:31 crc kubenswrapper[4756]: I0318 14:21:31.971866 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.150670 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.264895 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.334554 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.441108 4756 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.504969 4756 generic.go:334] "Generic (PLEG): container finished" podID="1726f323-2352-403b-91cd-87f37d02bbd6" containerID="7a9ce98cc69d9141e62750c9461fbdfeafa32d6682c6469b10f864ffde5683b8" exitCode=0 Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.505045 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" event={"ID":"1726f323-2352-403b-91cd-87f37d02bbd6","Type":"ContainerDied","Data":"7a9ce98cc69d9141e62750c9461fbdfeafa32d6682c6469b10f864ffde5683b8"} Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.508081 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"420186eb-4b69-4d45-a7c2-1d65009f2788","Type":"ContainerStarted","Data":"431f52071baf78fee04f2fa1ef5c785116bbd2930f7addcf436df2100e403ca5"} Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.509910 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vtxsp" event={"ID":"cca843b7-8858-4033-8d07-5de75be06ce4","Type":"ContainerStarted","Data":"4daeea50cec98c5deb566ebece1d6febddb5cded68a2ecf68c2ff6f17ca9ff3c"} Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.512588 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lvdn6" event={"ID":"18b4eda4-6808-4165-8b7c-e24dc046467c","Type":"ContainerStarted","Data":"36ebabe9a17cc18caafe7a68d265478cc861065857e7d788137afa03c454132d"} Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.514858 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a791a6a-0e50-465a-90cf-e4af5bdc12de","Type":"ContainerStarted","Data":"b881e246a0058c4d718cf673d769f0b3b79a465134dd30fe87e792a89dc748ef"} Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.516498 4756 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-db-sync-z7pth" event={"ID":"4bb78a1f-b53a-4b17-8b91-a70da1bb9071","Type":"ContainerStarted","Data":"b02d67eec55be30003a0e76f6969294b0e486d32472e917f8959eec771490a55"} Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.516518 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z7pth" event={"ID":"4bb78a1f-b53a-4b17-8b91-a70da1bb9071","Type":"ContainerStarted","Data":"58fd95c30e59944d6780108b9a99f78ddc43c3e77d7746525d269764e53ac6bb"} Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.520036 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-tpt8q" event={"ID":"7401eed2-7f0c-4f80-932b-5bc2df6684f8","Type":"ContainerStarted","Data":"613d75d846910596582c301beac7223f31a73f201fe536624a7a35b859389892"} Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.528256 4756 generic.go:334] "Generic (PLEG): container finished" podID="2279186d-a6bf-4a99-a62f-6f1a0a405269" containerID="bf9f08b75e9442f6ba522deacdcd2f7bb836a85753a6f877b721088042a46c83" exitCode=0 Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.528298 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" event={"ID":"2279186d-a6bf-4a99-a62f-6f1a0a405269","Type":"ContainerDied","Data":"bf9f08b75e9442f6ba522deacdcd2f7bb836a85753a6f877b721088042a46c83"} Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.528322 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" event={"ID":"2279186d-a6bf-4a99-a62f-6f1a0a405269","Type":"ContainerStarted","Data":"3f195469aa41e3789712c6e6ec5f0169af2c233fba7e27ed1ea819b58516a5a8"} Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.553319 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-z7pth" podStartSLOduration=2.553302098 podStartE2EDuration="2.553302098s" podCreationTimestamp="2026-03-18 14:21:30 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:21:32.54152701 +0000 UTC m=+1293.855944985" watchObservedRunningTime="2026-03-18 14:21:32.553302098 +0000 UTC m=+1293.867720073" Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.586349 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vtxsp" podStartSLOduration=3.586332661 podStartE2EDuration="3.586332661s" podCreationTimestamp="2026-03-18 14:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:21:32.556549276 +0000 UTC m=+1293.870967251" watchObservedRunningTime="2026-03-18 14:21:32.586332661 +0000 UTC m=+1293.900750636" Mar 18 14:21:32 crc kubenswrapper[4756]: I0318 14:21:32.975249 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.138783 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-dns-swift-storage-0\") pod \"1726f323-2352-403b-91cd-87f37d02bbd6\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.139067 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-dns-svc\") pod \"1726f323-2352-403b-91cd-87f37d02bbd6\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.139125 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-config\") pod 
\"1726f323-2352-403b-91cd-87f37d02bbd6\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.139182 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzbqr\" (UniqueName: \"kubernetes.io/projected/1726f323-2352-403b-91cd-87f37d02bbd6-kube-api-access-rzbqr\") pod \"1726f323-2352-403b-91cd-87f37d02bbd6\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.139805 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-ovsdbserver-nb\") pod \"1726f323-2352-403b-91cd-87f37d02bbd6\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.139877 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-ovsdbserver-sb\") pod \"1726f323-2352-403b-91cd-87f37d02bbd6\" (UID: \"1726f323-2352-403b-91cd-87f37d02bbd6\") " Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.148582 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1726f323-2352-403b-91cd-87f37d02bbd6-kube-api-access-rzbqr" (OuterVolumeSpecName: "kube-api-access-rzbqr") pod "1726f323-2352-403b-91cd-87f37d02bbd6" (UID: "1726f323-2352-403b-91cd-87f37d02bbd6"). InnerVolumeSpecName "kube-api-access-rzbqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.183189 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1726f323-2352-403b-91cd-87f37d02bbd6" (UID: "1726f323-2352-403b-91cd-87f37d02bbd6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.184799 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1726f323-2352-403b-91cd-87f37d02bbd6" (UID: "1726f323-2352-403b-91cd-87f37d02bbd6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.193767 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1726f323-2352-403b-91cd-87f37d02bbd6" (UID: "1726f323-2352-403b-91cd-87f37d02bbd6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.202386 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1726f323-2352-403b-91cd-87f37d02bbd6" (UID: "1726f323-2352-403b-91cd-87f37d02bbd6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.210267 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-config" (OuterVolumeSpecName: "config") pod "1726f323-2352-403b-91cd-87f37d02bbd6" (UID: "1726f323-2352-403b-91cd-87f37d02bbd6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.241562 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.241598 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.241609 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.241618 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.241627 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzbqr\" (UniqueName: \"kubernetes.io/projected/1726f323-2352-403b-91cd-87f37d02bbd6-kube-api-access-rzbqr\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.241636 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1726f323-2352-403b-91cd-87f37d02bbd6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.567994 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" event={"ID":"2279186d-a6bf-4a99-a62f-6f1a0a405269","Type":"ContainerStarted","Data":"8afca66c131ab3c8ef6705d5594492be692fd18cebc663ccbb37cae1ccb2b7ba"} Mar 18 14:21:33 crc 
kubenswrapper[4756]: I0318 14:21:33.569559 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.577531 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"420186eb-4b69-4d45-a7c2-1d65009f2788","Type":"ContainerStarted","Data":"1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579"} Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.582197 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" event={"ID":"1726f323-2352-403b-91cd-87f37d02bbd6","Type":"ContainerDied","Data":"f4f4b3ce132c8491b398a5e98fa6a4053fdea7b4d1b6ca43697d65f58c1a8b19"} Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.582250 4756 scope.go:117] "RemoveContainer" containerID="7a9ce98cc69d9141e62750c9461fbdfeafa32d6682c6469b10f864ffde5683b8" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.582390 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-9s9ss" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.588170 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"147bdf88-f75b-400a-9150-794ad2ebdb6f","Type":"ContainerStarted","Data":"c31b7f037ea289ef586d49f0772d50a6b40d66bb4e8ee8ee12638d5fc78ced68"} Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.607239 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" podStartSLOduration=3.607219125 podStartE2EDuration="3.607219125s" podCreationTimestamp="2026-03-18 14:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:21:33.589587598 +0000 UTC m=+1294.904005573" watchObservedRunningTime="2026-03-18 14:21:33.607219125 +0000 UTC m=+1294.921637100" Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.663409 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-9s9ss"] Mar 18 14:21:33 crc kubenswrapper[4756]: I0318 14:21:33.671111 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-9s9ss"] Mar 18 14:21:34 crc kubenswrapper[4756]: I0318 14:21:34.615479 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"420186eb-4b69-4d45-a7c2-1d65009f2788","Type":"ContainerStarted","Data":"674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2"} Mar 18 14:21:34 crc kubenswrapper[4756]: I0318 14:21:34.615622 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="420186eb-4b69-4d45-a7c2-1d65009f2788" containerName="glance-log" containerID="cri-o://1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579" gracePeriod=30 Mar 18 14:21:34 crc kubenswrapper[4756]: 
I0318 14:21:34.616182 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="420186eb-4b69-4d45-a7c2-1d65009f2788" containerName="glance-httpd" containerID="cri-o://674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2" gracePeriod=30 Mar 18 14:21:34 crc kubenswrapper[4756]: I0318 14:21:34.619276 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"147bdf88-f75b-400a-9150-794ad2ebdb6f","Type":"ContainerStarted","Data":"a9ea09330adb62340da11e5e2b97ff53150cd2548bd08eed758cb633aa7e222a"} Mar 18 14:21:34 crc kubenswrapper[4756]: I0318 14:21:34.649347 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.649329782 podStartE2EDuration="5.649329782s" podCreationTimestamp="2026-03-18 14:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:21:34.641375077 +0000 UTC m=+1295.955793052" watchObservedRunningTime="2026-03-18 14:21:34.649329782 +0000 UTC m=+1295.963747747" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.318768 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.345150 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1726f323-2352-403b-91cd-87f37d02bbd6" path="/var/lib/kubelet/pods/1726f323-2352-403b-91cd-87f37d02bbd6/volumes" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.504270 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-scripts\") pod \"420186eb-4b69-4d45-a7c2-1d65009f2788\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.504713 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/420186eb-4b69-4d45-a7c2-1d65009f2788-httpd-run\") pod \"420186eb-4b69-4d45-a7c2-1d65009f2788\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.504742 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-combined-ca-bundle\") pod \"420186eb-4b69-4d45-a7c2-1d65009f2788\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.504791 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/420186eb-4b69-4d45-a7c2-1d65009f2788-logs\") pod \"420186eb-4b69-4d45-a7c2-1d65009f2788\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.504848 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fd8f\" (UniqueName: \"kubernetes.io/projected/420186eb-4b69-4d45-a7c2-1d65009f2788-kube-api-access-2fd8f\") pod 
\"420186eb-4b69-4d45-a7c2-1d65009f2788\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.504914 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-config-data\") pod \"420186eb-4b69-4d45-a7c2-1d65009f2788\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.504943 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-internal-tls-certs\") pod \"420186eb-4b69-4d45-a7c2-1d65009f2788\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.505039 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") pod \"420186eb-4b69-4d45-a7c2-1d65009f2788\" (UID: \"420186eb-4b69-4d45-a7c2-1d65009f2788\") " Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.505007 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/420186eb-4b69-4d45-a7c2-1d65009f2788-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "420186eb-4b69-4d45-a7c2-1d65009f2788" (UID: "420186eb-4b69-4d45-a7c2-1d65009f2788"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.505604 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/420186eb-4b69-4d45-a7c2-1d65009f2788-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.506815 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/420186eb-4b69-4d45-a7c2-1d65009f2788-logs" (OuterVolumeSpecName: "logs") pod "420186eb-4b69-4d45-a7c2-1d65009f2788" (UID: "420186eb-4b69-4d45-a7c2-1d65009f2788"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.513250 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-scripts" (OuterVolumeSpecName: "scripts") pod "420186eb-4b69-4d45-a7c2-1d65009f2788" (UID: "420186eb-4b69-4d45-a7c2-1d65009f2788"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.524310 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/420186eb-4b69-4d45-a7c2-1d65009f2788-kube-api-access-2fd8f" (OuterVolumeSpecName: "kube-api-access-2fd8f") pod "420186eb-4b69-4d45-a7c2-1d65009f2788" (UID: "420186eb-4b69-4d45-a7c2-1d65009f2788"). InnerVolumeSpecName "kube-api-access-2fd8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.526432 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b" (OuterVolumeSpecName: "glance") pod "420186eb-4b69-4d45-a7c2-1d65009f2788" (UID: "420186eb-4b69-4d45-a7c2-1d65009f2788"). 
InnerVolumeSpecName "pvc-565e4be1-2f3b-410a-808c-677fff515f0b". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.546252 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "420186eb-4b69-4d45-a7c2-1d65009f2788" (UID: "420186eb-4b69-4d45-a7c2-1d65009f2788"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.581021 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-config-data" (OuterVolumeSpecName: "config-data") pod "420186eb-4b69-4d45-a7c2-1d65009f2788" (UID: "420186eb-4b69-4d45-a7c2-1d65009f2788"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.590771 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "420186eb-4b69-4d45-a7c2-1d65009f2788" (UID: "420186eb-4b69-4d45-a7c2-1d65009f2788"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.607211 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.607241 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/420186eb-4b69-4d45-a7c2-1d65009f2788-logs\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.607251 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fd8f\" (UniqueName: \"kubernetes.io/projected/420186eb-4b69-4d45-a7c2-1d65009f2788-kube-api-access-2fd8f\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.607264 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.607273 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.607308 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") on node \"crc\" " Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.607318 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/420186eb-4b69-4d45-a7c2-1d65009f2788-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:35 crc kubenswrapper[4756]: 
I0318 14:21:35.652245 4756 generic.go:334] "Generic (PLEG): container finished" podID="420186eb-4b69-4d45-a7c2-1d65009f2788" containerID="674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2" exitCode=143 Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.652298 4756 generic.go:334] "Generic (PLEG): container finished" podID="420186eb-4b69-4d45-a7c2-1d65009f2788" containerID="1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579" exitCode=143 Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.652307 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.652359 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"420186eb-4b69-4d45-a7c2-1d65009f2788","Type":"ContainerDied","Data":"674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2"} Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.652393 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"420186eb-4b69-4d45-a7c2-1d65009f2788","Type":"ContainerDied","Data":"1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579"} Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.652406 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"420186eb-4b69-4d45-a7c2-1d65009f2788","Type":"ContainerDied","Data":"431f52071baf78fee04f2fa1ef5c785116bbd2930f7addcf436df2100e403ca5"} Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.652421 4756 scope.go:117] "RemoveContainer" containerID="674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.662045 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"147bdf88-f75b-400a-9150-794ad2ebdb6f","Type":"ContainerStarted","Data":"cd49b53d40c0c7d3d96695e99cec5c9170314f6655071dad0505b8a1951aadb4"} Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.662079 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="147bdf88-f75b-400a-9150-794ad2ebdb6f" containerName="glance-log" containerID="cri-o://a9ea09330adb62340da11e5e2b97ff53150cd2548bd08eed758cb633aa7e222a" gracePeriod=30 Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.662208 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="147bdf88-f75b-400a-9150-794ad2ebdb6f" containerName="glance-httpd" containerID="cri-o://cd49b53d40c0c7d3d96695e99cec5c9170314f6655071dad0505b8a1951aadb4" gracePeriod=30 Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.691064 4756 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.691243 4756 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-565e4be1-2f3b-410a-808c-677fff515f0b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b") on node "crc" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.701086 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.701057531 podStartE2EDuration="6.701057531s" podCreationTimestamp="2026-03-18 14:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:21:35.691476641 +0000 UTC m=+1297.005894616" watchObservedRunningTime="2026-03-18 14:21:35.701057531 +0000 UTC m=+1297.015475506" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.709107 4756 reconciler_common.go:293] "Volume detached for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.717881 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.730715 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.740040 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 14:21:35 crc kubenswrapper[4756]: E0318 14:21:35.740472 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="420186eb-4b69-4d45-a7c2-1d65009f2788" containerName="glance-httpd" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.740492 4756 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="420186eb-4b69-4d45-a7c2-1d65009f2788" containerName="glance-httpd" Mar 18 14:21:35 crc kubenswrapper[4756]: E0318 14:21:35.740508 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1726f323-2352-403b-91cd-87f37d02bbd6" containerName="init" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.740514 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1726f323-2352-403b-91cd-87f37d02bbd6" containerName="init" Mar 18 14:21:35 crc kubenswrapper[4756]: E0318 14:21:35.740536 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="420186eb-4b69-4d45-a7c2-1d65009f2788" containerName="glance-log" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.740541 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="420186eb-4b69-4d45-a7c2-1d65009f2788" containerName="glance-log" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.740712 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="420186eb-4b69-4d45-a7c2-1d65009f2788" containerName="glance-httpd" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.740733 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1726f323-2352-403b-91cd-87f37d02bbd6" containerName="init" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.740750 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="420186eb-4b69-4d45-a7c2-1d65009f2788" containerName="glance-log" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.741709 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.744842 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.745028 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.753871 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.914788 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.914859 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/660c0c11-4c3e-465d-a8e8-181dfda9f400-logs\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.914886 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdz87\" (UniqueName: \"kubernetes.io/projected/660c0c11-4c3e-465d-a8e8-181dfda9f400-kube-api-access-qdz87\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.914921 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-config-data\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.914950 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/660c0c11-4c3e-465d-a8e8-181dfda9f400-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.914970 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.914989 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-scripts\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:35 crc kubenswrapper[4756]: I0318 14:21:35.915026 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.016653 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-config-data\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.017066 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/660c0c11-4c3e-465d-a8e8-181dfda9f400-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.017094 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.017131 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-scripts\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.017191 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.017256 4756 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.017317 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/660c0c11-4c3e-465d-a8e8-181dfda9f400-logs\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.017340 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdz87\" (UniqueName: \"kubernetes.io/projected/660c0c11-4c3e-465d-a8e8-181dfda9f400-kube-api-access-qdz87\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.017702 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/660c0c11-4c3e-465d-a8e8-181dfda9f400-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.018427 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/660c0c11-4c3e-465d-a8e8-181dfda9f400-logs\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.022309 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.022719 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-config-data\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.022882 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-scripts\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.025732 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.026764 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.026792 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/14728f060a3b5046b333048907b438ee0376fa68800afec942964a27fea1d4a8/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.034993 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdz87\" (UniqueName: \"kubernetes.io/projected/660c0c11-4c3e-465d-a8e8-181dfda9f400-kube-api-access-qdz87\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.062829 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") pod \"glance-default-internal-api-0\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.066631 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.673311 4756 generic.go:334] "Generic (PLEG): container finished" podID="cca843b7-8858-4033-8d07-5de75be06ce4" containerID="4daeea50cec98c5deb566ebece1d6febddb5cded68a2ecf68c2ff6f17ca9ff3c" exitCode=0 Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.673371 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vtxsp" event={"ID":"cca843b7-8858-4033-8d07-5de75be06ce4","Type":"ContainerDied","Data":"4daeea50cec98c5deb566ebece1d6febddb5cded68a2ecf68c2ff6f17ca9ff3c"} Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.675374 4756 generic.go:334] "Generic (PLEG): container finished" podID="147bdf88-f75b-400a-9150-794ad2ebdb6f" containerID="cd49b53d40c0c7d3d96695e99cec5c9170314f6655071dad0505b8a1951aadb4" exitCode=0 Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.675411 4756 generic.go:334] "Generic (PLEG): container finished" podID="147bdf88-f75b-400a-9150-794ad2ebdb6f" containerID="a9ea09330adb62340da11e5e2b97ff53150cd2548bd08eed758cb633aa7e222a" exitCode=143 Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.675451 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"147bdf88-f75b-400a-9150-794ad2ebdb6f","Type":"ContainerDied","Data":"cd49b53d40c0c7d3d96695e99cec5c9170314f6655071dad0505b8a1951aadb4"} Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.675529 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"147bdf88-f75b-400a-9150-794ad2ebdb6f","Type":"ContainerDied","Data":"a9ea09330adb62340da11e5e2b97ff53150cd2548bd08eed758cb633aa7e222a"} Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.915099 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:21:36 crc kubenswrapper[4756]: I0318 14:21:36.915407 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:21:37 crc kubenswrapper[4756]: I0318 14:21:37.331286 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="420186eb-4b69-4d45-a7c2-1d65009f2788" path="/var/lib/kubelet/pods/420186eb-4b69-4d45-a7c2-1d65009f2788/volumes" Mar 18 14:21:40 crc kubenswrapper[4756]: I0318 14:21:40.776557 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:21:40 crc kubenswrapper[4756]: I0318 14:21:40.841921 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-bx844"] Mar 18 14:21:40 crc kubenswrapper[4756]: I0318 14:21:40.854720 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" podUID="c569a795-118f-49c7-850d-798474e0b461" containerName="dnsmasq-dns" containerID="cri-o://3b6bf8cab0f4a060d8b68d1b475d2350915f50ed8aa12154268eba9fa4a5b39f" gracePeriod=10 Mar 18 14:21:41 crc kubenswrapper[4756]: I0318 14:21:41.764512 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" podUID="c569a795-118f-49c7-850d-798474e0b461" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: connect: connection refused" Mar 18 14:21:41 crc kubenswrapper[4756]: I0318 14:21:41.765090 4756 generic.go:334] "Generic (PLEG): container finished" podID="c569a795-118f-49c7-850d-798474e0b461" 
containerID="3b6bf8cab0f4a060d8b68d1b475d2350915f50ed8aa12154268eba9fa4a5b39f" exitCode=0 Mar 18 14:21:41 crc kubenswrapper[4756]: I0318 14:21:41.765152 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" event={"ID":"c569a795-118f-49c7-850d-798474e0b461","Type":"ContainerDied","Data":"3b6bf8cab0f4a060d8b68d1b475d2350915f50ed8aa12154268eba9fa4a5b39f"} Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.304439 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.462035 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-public-tls-certs\") pod \"147bdf88-f75b-400a-9150-794ad2ebdb6f\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.462098 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/147bdf88-f75b-400a-9150-794ad2ebdb6f-httpd-run\") pod \"147bdf88-f75b-400a-9150-794ad2ebdb6f\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.462169 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-combined-ca-bundle\") pod \"147bdf88-f75b-400a-9150-794ad2ebdb6f\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.462216 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-config-data\") pod \"147bdf88-f75b-400a-9150-794ad2ebdb6f\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " Mar 
18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.462248 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq7dl\" (UniqueName: \"kubernetes.io/projected/147bdf88-f75b-400a-9150-794ad2ebdb6f-kube-api-access-qq7dl\") pod \"147bdf88-f75b-400a-9150-794ad2ebdb6f\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.462546 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") pod \"147bdf88-f75b-400a-9150-794ad2ebdb6f\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.462607 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/147bdf88-f75b-400a-9150-794ad2ebdb6f-logs\") pod \"147bdf88-f75b-400a-9150-794ad2ebdb6f\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.462695 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-scripts\") pod \"147bdf88-f75b-400a-9150-794ad2ebdb6f\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.464268 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/147bdf88-f75b-400a-9150-794ad2ebdb6f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "147bdf88-f75b-400a-9150-794ad2ebdb6f" (UID: "147bdf88-f75b-400a-9150-794ad2ebdb6f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.465412 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/147bdf88-f75b-400a-9150-794ad2ebdb6f-logs" (OuterVolumeSpecName: "logs") pod "147bdf88-f75b-400a-9150-794ad2ebdb6f" (UID: "147bdf88-f75b-400a-9150-794ad2ebdb6f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.470693 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-scripts" (OuterVolumeSpecName: "scripts") pod "147bdf88-f75b-400a-9150-794ad2ebdb6f" (UID: "147bdf88-f75b-400a-9150-794ad2ebdb6f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.470854 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/147bdf88-f75b-400a-9150-794ad2ebdb6f-kube-api-access-qq7dl" (OuterVolumeSpecName: "kube-api-access-qq7dl") pod "147bdf88-f75b-400a-9150-794ad2ebdb6f" (UID: "147bdf88-f75b-400a-9150-794ad2ebdb6f"). InnerVolumeSpecName "kube-api-access-qq7dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.504391 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "147bdf88-f75b-400a-9150-794ad2ebdb6f" (UID: "147bdf88-f75b-400a-9150-794ad2ebdb6f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:42 crc kubenswrapper[4756]: E0318 14:21:42.521458 4756 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817 podName:147bdf88-f75b-400a-9150-794ad2ebdb6f nodeName:}" failed. No retries permitted until 2026-03-18 14:21:43.021427476 +0000 UTC m=+1304.335845451 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "glance" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817") pod "147bdf88-f75b-400a-9150-794ad2ebdb6f" (UID: "147bdf88-f75b-400a-9150-794ad2ebdb6f") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.524979 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "147bdf88-f75b-400a-9150-794ad2ebdb6f" (UID: "147bdf88-f75b-400a-9150-794ad2ebdb6f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.545283 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-config-data" (OuterVolumeSpecName: "config-data") pod "147bdf88-f75b-400a-9150-794ad2ebdb6f" (UID: "147bdf88-f75b-400a-9150-794ad2ebdb6f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.565776 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.565813 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.565825 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq7dl\" (UniqueName: \"kubernetes.io/projected/147bdf88-f75b-400a-9150-794ad2ebdb6f-kube-api-access-qq7dl\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.565839 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/147bdf88-f75b-400a-9150-794ad2ebdb6f-logs\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.565850 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.565861 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/147bdf88-f75b-400a-9150-794ad2ebdb6f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.565872 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/147bdf88-f75b-400a-9150-794ad2ebdb6f-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.775047 4756 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"147bdf88-f75b-400a-9150-794ad2ebdb6f","Type":"ContainerDied","Data":"c31b7f037ea289ef586d49f0772d50a6b40d66bb4e8ee8ee12638d5fc78ced68"} Mar 18 14:21:42 crc kubenswrapper[4756]: I0318 14:21:42.775140 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.079988 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") pod \"147bdf88-f75b-400a-9150-794ad2ebdb6f\" (UID: \"147bdf88-f75b-400a-9150-794ad2ebdb6f\") " Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.100666 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817" (OuterVolumeSpecName: "glance") pod "147bdf88-f75b-400a-9150-794ad2ebdb6f" (UID: "147bdf88-f75b-400a-9150-794ad2ebdb6f"). InnerVolumeSpecName "pvc-d44f36dc-e387-43e2-913e-de408349f817". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.183039 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") on node \"crc\" " Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.209423 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.221252 4756 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.221432 4756 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d44f36dc-e387-43e2-913e-de408349f817" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817") on node "crc" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.238169 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.250408 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 14:21:43 crc kubenswrapper[4756]: E0318 14:21:43.250855 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147bdf88-f75b-400a-9150-794ad2ebdb6f" containerName="glance-log" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.250872 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="147bdf88-f75b-400a-9150-794ad2ebdb6f" containerName="glance-log" Mar 18 14:21:43 crc kubenswrapper[4756]: E0318 14:21:43.250887 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147bdf88-f75b-400a-9150-794ad2ebdb6f" containerName="glance-httpd" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.250895 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="147bdf88-f75b-400a-9150-794ad2ebdb6f" containerName="glance-httpd" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.251088 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="147bdf88-f75b-400a-9150-794ad2ebdb6f" containerName="glance-httpd" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.251113 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="147bdf88-f75b-400a-9150-794ad2ebdb6f" containerName="glance-log" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.252582 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.256151 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.257276 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.267475 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.284954 4756 reconciler_common.go:293] "Volume detached for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.333481 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="147bdf88-f75b-400a-9150-794ad2ebdb6f" path="/var/lib/kubelet/pods/147bdf88-f75b-400a-9150-794ad2ebdb6f/volumes" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.386514 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.386576 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " 
pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.386605 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bqct\" (UniqueName: \"kubernetes.io/projected/2f752fe8-a008-4737-873f-0ae42990431f-kube-api-access-8bqct\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.386655 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-config-data\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.386676 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.386691 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-scripts\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.386761 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f752fe8-a008-4737-873f-0ae42990431f-logs\") pod \"glance-default-external-api-0\" (UID: 
\"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.386787 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f752fe8-a008-4737-873f-0ae42990431f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.488232 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.488293 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.488324 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bqct\" (UniqueName: \"kubernetes.io/projected/2f752fe8-a008-4737-873f-0ae42990431f-kube-api-access-8bqct\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.488380 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.488401 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.488417 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-scripts\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.488460 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f752fe8-a008-4737-873f-0ae42990431f-logs\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.488483 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f752fe8-a008-4737-873f-0ae42990431f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.489361 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f752fe8-a008-4737-873f-0ae42990431f-logs\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " 
pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.489433 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f752fe8-a008-4737-873f-0ae42990431f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.491464 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.491500 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2d16b602ee6d280a584346f693cabe12df6f04b4d4e7d81a050a495635db21be/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.493651 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.493763 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 
14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.495941 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-config-data\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.496521 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-scripts\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.514766 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bqct\" (UniqueName: \"kubernetes.io/projected/2f752fe8-a008-4737-873f-0ae42990431f-kube-api-access-8bqct\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.532177 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") pod \"glance-default-external-api-0\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " pod="openstack/glance-default-external-api-0" Mar 18 14:21:43 crc kubenswrapper[4756]: I0318 14:21:43.574206 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 14:21:46 crc kubenswrapper[4756]: I0318 14:21:46.764226 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" podUID="c569a795-118f-49c7-850d-798474e0b461" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: connect: connection refused" Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.112027 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.184992 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-combined-ca-bundle\") pod \"cca843b7-8858-4033-8d07-5de75be06ce4\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.185067 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-fernet-keys\") pod \"cca843b7-8858-4033-8d07-5de75be06ce4\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.185189 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-config-data\") pod \"cca843b7-8858-4033-8d07-5de75be06ce4\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.185330 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-scripts\") pod \"cca843b7-8858-4033-8d07-5de75be06ce4\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " Mar 18 14:21:48 crc 
kubenswrapper[4756]: I0318 14:21:48.185370 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-credential-keys\") pod \"cca843b7-8858-4033-8d07-5de75be06ce4\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.185442 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjgwr\" (UniqueName: \"kubernetes.io/projected/cca843b7-8858-4033-8d07-5de75be06ce4-kube-api-access-qjgwr\") pod \"cca843b7-8858-4033-8d07-5de75be06ce4\" (UID: \"cca843b7-8858-4033-8d07-5de75be06ce4\") " Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.195272 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca843b7-8858-4033-8d07-5de75be06ce4-kube-api-access-qjgwr" (OuterVolumeSpecName: "kube-api-access-qjgwr") pod "cca843b7-8858-4033-8d07-5de75be06ce4" (UID: "cca843b7-8858-4033-8d07-5de75be06ce4"). InnerVolumeSpecName "kube-api-access-qjgwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.200293 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cca843b7-8858-4033-8d07-5de75be06ce4" (UID: "cca843b7-8858-4033-8d07-5de75be06ce4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.202480 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cca843b7-8858-4033-8d07-5de75be06ce4" (UID: "cca843b7-8858-4033-8d07-5de75be06ce4"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.213297 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-scripts" (OuterVolumeSpecName: "scripts") pod "cca843b7-8858-4033-8d07-5de75be06ce4" (UID: "cca843b7-8858-4033-8d07-5de75be06ce4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.224298 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-config-data" (OuterVolumeSpecName: "config-data") pod "cca843b7-8858-4033-8d07-5de75be06ce4" (UID: "cca843b7-8858-4033-8d07-5de75be06ce4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.232477 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cca843b7-8858-4033-8d07-5de75be06ce4" (UID: "cca843b7-8858-4033-8d07-5de75be06ce4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.287723 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjgwr\" (UniqueName: \"kubernetes.io/projected/cca843b7-8858-4033-8d07-5de75be06ce4-kube-api-access-qjgwr\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.287786 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.287796 4756 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.287806 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.287814 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.287821 4756 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cca843b7-8858-4033-8d07-5de75be06ce4-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.840071 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vtxsp" event={"ID":"cca843b7-8858-4033-8d07-5de75be06ce4","Type":"ContainerDied","Data":"3d2f533ab7ba2c9a4cac9f891377d6b3c286c21ec9555ae3b514f1e5e564eae0"} Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 
14:21:48.840427 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d2f533ab7ba2c9a4cac9f891377d6b3c286c21ec9555ae3b514f1e5e564eae0" Mar 18 14:21:48 crc kubenswrapper[4756]: I0318 14:21:48.840193 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vtxsp" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.208468 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vtxsp"] Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.216314 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vtxsp"] Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.312611 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fqmsw"] Mar 18 14:21:49 crc kubenswrapper[4756]: E0318 14:21:49.313017 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca843b7-8858-4033-8d07-5de75be06ce4" containerName="keystone-bootstrap" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.313036 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca843b7-8858-4033-8d07-5de75be06ce4" containerName="keystone-bootstrap" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.313264 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca843b7-8858-4033-8d07-5de75be06ce4" containerName="keystone-bootstrap" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.313965 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.317306 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.317728 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.317735 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.318611 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jstgl" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.327372 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca843b7-8858-4033-8d07-5de75be06ce4" path="/var/lib/kubelet/pods/cca843b7-8858-4033-8d07-5de75be06ce4/volumes" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.327883 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fqmsw"] Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.413861 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-combined-ca-bundle\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.413908 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2n59\" (UniqueName: \"kubernetes.io/projected/0cac161e-ec74-4621-801b-ad39634336d0-kube-api-access-s2n59\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 
14:21:49.414504 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-scripts\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.414580 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-config-data\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.414631 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-credential-keys\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.414737 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-fernet-keys\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.516735 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-combined-ca-bundle\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.516812 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2n59\" (UniqueName: \"kubernetes.io/projected/0cac161e-ec74-4621-801b-ad39634336d0-kube-api-access-s2n59\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.517041 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-scripts\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.517076 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-config-data\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.517170 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-credential-keys\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.517237 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-fernet-keys\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.523500 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-fernet-keys\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.524056 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-credential-keys\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.524577 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-combined-ca-bundle\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.525180 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-config-data\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.528360 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-scripts\") pod \"keystone-bootstrap-fqmsw\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.533876 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2n59\" (UniqueName: \"kubernetes.io/projected/0cac161e-ec74-4621-801b-ad39634336d0-kube-api-access-s2n59\") pod \"keystone-bootstrap-fqmsw\" (UID: 
\"0cac161e-ec74-4621-801b-ad39634336d0\") " pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:49 crc kubenswrapper[4756]: I0318 14:21:49.635755 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fqmsw" Mar 18 14:21:51 crc kubenswrapper[4756]: I0318 14:21:51.871166 4756 generic.go:334] "Generic (PLEG): container finished" podID="4bb78a1f-b53a-4b17-8b91-a70da1bb9071" containerID="b02d67eec55be30003a0e76f6969294b0e486d32472e917f8959eec771490a55" exitCode=0 Mar 18 14:21:51 crc kubenswrapper[4756]: I0318 14:21:51.871227 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z7pth" event={"ID":"4bb78a1f-b53a-4b17-8b91-a70da1bb9071","Type":"ContainerDied","Data":"b02d67eec55be30003a0e76f6969294b0e486d32472e917f8959eec771490a55"} Mar 18 14:21:56 crc kubenswrapper[4756]: I0318 14:21:56.765177 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" podUID="c569a795-118f-49c7-850d-798474e0b461" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: i/o timeout" Mar 18 14:21:56 crc kubenswrapper[4756]: I0318 14:21:56.766321 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.112068 4756 scope.go:117] "RemoveContainer" containerID="1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.680923 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.689661 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-z7pth" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.820609 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-dns-swift-storage-0\") pod \"c569a795-118f-49c7-850d-798474e0b461\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.820972 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-ovsdbserver-nb\") pod \"c569a795-118f-49c7-850d-798474e0b461\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.821007 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-ovsdbserver-sb\") pod \"c569a795-118f-49c7-850d-798474e0b461\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.821114 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-dns-svc\") pod \"c569a795-118f-49c7-850d-798474e0b461\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.821197 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d74jm\" (UniqueName: \"kubernetes.io/projected/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-kube-api-access-d74jm\") pod \"4bb78a1f-b53a-4b17-8b91-a70da1bb9071\" (UID: \"4bb78a1f-b53a-4b17-8b91-a70da1bb9071\") " Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.821258 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-5d25d\" (UniqueName: \"kubernetes.io/projected/c569a795-118f-49c7-850d-798474e0b461-kube-api-access-5d25d\") pod \"c569a795-118f-49c7-850d-798474e0b461\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.821336 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-config\") pod \"4bb78a1f-b53a-4b17-8b91-a70da1bb9071\" (UID: \"4bb78a1f-b53a-4b17-8b91-a70da1bb9071\") " Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.821467 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-config\") pod \"c569a795-118f-49c7-850d-798474e0b461\" (UID: \"c569a795-118f-49c7-850d-798474e0b461\") " Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.821501 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-combined-ca-bundle\") pod \"4bb78a1f-b53a-4b17-8b91-a70da1bb9071\" (UID: \"4bb78a1f-b53a-4b17-8b91-a70da1bb9071\") " Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.842643 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-kube-api-access-d74jm" (OuterVolumeSpecName: "kube-api-access-d74jm") pod "4bb78a1f-b53a-4b17-8b91-a70da1bb9071" (UID: "4bb78a1f-b53a-4b17-8b91-a70da1bb9071"). InnerVolumeSpecName "kube-api-access-d74jm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.843245 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c569a795-118f-49c7-850d-798474e0b461-kube-api-access-5d25d" (OuterVolumeSpecName: "kube-api-access-5d25d") pod "c569a795-118f-49c7-850d-798474e0b461" (UID: "c569a795-118f-49c7-850d-798474e0b461"). InnerVolumeSpecName "kube-api-access-5d25d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.847760 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-config" (OuterVolumeSpecName: "config") pod "4bb78a1f-b53a-4b17-8b91-a70da1bb9071" (UID: "4bb78a1f-b53a-4b17-8b91-a70da1bb9071"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.848572 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bb78a1f-b53a-4b17-8b91-a70da1bb9071" (UID: "4bb78a1f-b53a-4b17-8b91-a70da1bb9071"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.867694 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c569a795-118f-49c7-850d-798474e0b461" (UID: "c569a795-118f-49c7-850d-798474e0b461"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.875665 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c569a795-118f-49c7-850d-798474e0b461" (UID: "c569a795-118f-49c7-850d-798474e0b461"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.888245 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-config" (OuterVolumeSpecName: "config") pod "c569a795-118f-49c7-850d-798474e0b461" (UID: "c569a795-118f-49c7-850d-798474e0b461"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.890639 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c569a795-118f-49c7-850d-798474e0b461" (UID: "c569a795-118f-49c7-850d-798474e0b461"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.901616 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c569a795-118f-49c7-850d-798474e0b461" (UID: "c569a795-118f-49c7-850d-798474e0b461"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.923677 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.923712 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.923725 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.923735 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.923745 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.923754 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c569a795-118f-49c7-850d-798474e0b461-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.923763 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d74jm\" (UniqueName: \"kubernetes.io/projected/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-kube-api-access-d74jm\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.923771 4756 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d25d\" (UniqueName: \"kubernetes.io/projected/c569a795-118f-49c7-850d-798474e0b461-kube-api-access-5d25d\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.923781 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bb78a1f-b53a-4b17-8b91-a70da1bb9071-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.961359 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z7pth" event={"ID":"4bb78a1f-b53a-4b17-8b91-a70da1bb9071","Type":"ContainerDied","Data":"58fd95c30e59944d6780108b9a99f78ddc43c3e77d7746525d269764e53ac6bb"} Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.961420 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58fd95c30e59944d6780108b9a99f78ddc43c3e77d7746525d269764e53ac6bb" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.961487 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z7pth" Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.967883 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" event={"ID":"c569a795-118f-49c7-850d-798474e0b461","Type":"ContainerDied","Data":"8149f0290a3789951e83ee56114fd62d6b65b8857e52fd0a4402e209458c23f5"} Mar 18 14:21:58 crc kubenswrapper[4756]: I0318 14:21:58.967961 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.012861 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-bx844"] Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.023806 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-bx844"] Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.351924 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c569a795-118f-49c7-850d-798474e0b461" path="/var/lib/kubelet/pods/c569a795-118f-49c7-850d-798474e0b461/volumes" Mar 18 14:21:59 crc kubenswrapper[4756]: E0318 14:21:59.751976 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 18 14:21:59 crc kubenswrapper[4756]: E0318 14:21:59.752516 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-26xbh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-rpx8m_openstack(9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 14:21:59 crc kubenswrapper[4756]: E0318 14:21:59.756228 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-rpx8m" podUID="9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e" Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.905727 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tnw7j"] Mar 18 14:21:59 crc kubenswrapper[4756]: E0318 14:21:59.906456 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c569a795-118f-49c7-850d-798474e0b461" containerName="dnsmasq-dns" Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.906468 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c569a795-118f-49c7-850d-798474e0b461" containerName="dnsmasq-dns" Mar 18 14:21:59 crc kubenswrapper[4756]: E0318 14:21:59.906486 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c569a795-118f-49c7-850d-798474e0b461" containerName="init" Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.906491 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c569a795-118f-49c7-850d-798474e0b461" containerName="init" Mar 18 14:21:59 crc kubenswrapper[4756]: E0318 14:21:59.906507 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb78a1f-b53a-4b17-8b91-a70da1bb9071" containerName="neutron-db-sync" Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.906852 4756 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4bb78a1f-b53a-4b17-8b91-a70da1bb9071" containerName="neutron-db-sync" Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.907019 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c569a795-118f-49c7-850d-798474e0b461" containerName="dnsmasq-dns" Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.907034 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb78a1f-b53a-4b17-8b91-a70da1bb9071" containerName="neutron-db-sync" Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.907984 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.946865 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tnw7j"] Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.948744 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-dns-svc\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.948806 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.948831 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: 
\"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.948847 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-config\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.948896 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:21:59 crc kubenswrapper[4756]: I0318 14:21:59.948928 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6775\" (UniqueName: \"kubernetes.io/projected/92a17284-5f6e-4781-b315-374908f04a82-kube-api-access-f6775\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:22:00 crc kubenswrapper[4756]: E0318 14:22:00.003398 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-rpx8m" podUID="9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.031265 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56db767868-svvqr"] Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.033028 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.038639 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4vv6g" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.039036 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.039271 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.039403 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.046377 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56db767868-svvqr"] Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.054853 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6775\" (UniqueName: \"kubernetes.io/projected/92a17284-5f6e-4781-b315-374908f04a82-kube-api-access-f6775\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.055167 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-dns-svc\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.055255 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: 
\"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.055282 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.055313 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-config\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.055456 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.056817 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-dns-svc\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.057512 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" 
Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.057637 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.057764 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.058075 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-config\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.078974 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6775\" (UniqueName: \"kubernetes.io/projected/92a17284-5f6e-4781-b315-374908f04a82-kube-api-access-f6775\") pod \"dnsmasq-dns-55f844cf75-tnw7j\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.137328 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564062-rj85m"] Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.138980 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564062-rj85m" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.141742 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.142076 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.142411 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.147552 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564062-rj85m"] Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.157160 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-combined-ca-bundle\") pod \"neutron-56db767868-svvqr\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") " pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.157226 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-httpd-config\") pod \"neutron-56db767868-svvqr\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") " pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.158981 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-config\") pod \"neutron-56db767868-svvqr\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") " pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc 
kubenswrapper[4756]: I0318 14:22:00.159218 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-ovndb-tls-certs\") pod \"neutron-56db767868-svvqr\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") " pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.159249 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjflb\" (UniqueName: \"kubernetes.io/projected/e7ade8da-71b8-402b-8b7c-d2b333ab31da-kube-api-access-kjflb\") pod \"neutron-56db767868-svvqr\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") " pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.260832 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-ovndb-tls-certs\") pod \"neutron-56db767868-svvqr\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") " pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.260872 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjflb\" (UniqueName: \"kubernetes.io/projected/e7ade8da-71b8-402b-8b7c-d2b333ab31da-kube-api-access-kjflb\") pod \"neutron-56db767868-svvqr\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") " pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.260930 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-combined-ca-bundle\") pod \"neutron-56db767868-svvqr\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") " pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: 
I0318 14:22:00.260969 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-httpd-config\") pod \"neutron-56db767868-svvqr\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") " pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.261005 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dvfj\" (UniqueName: \"kubernetes.io/projected/5cc376c5-de8d-4577-913c-ba2ed9d1bc75-kube-api-access-6dvfj\") pod \"auto-csr-approver-29564062-rj85m\" (UID: \"5cc376c5-de8d-4577-913c-ba2ed9d1bc75\") " pod="openshift-infra/auto-csr-approver-29564062-rj85m" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.261036 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-config\") pod \"neutron-56db767868-svvqr\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") " pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.274503 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-ovndb-tls-certs\") pod \"neutron-56db767868-svvqr\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") " pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.280415 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-httpd-config\") pod \"neutron-56db767868-svvqr\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") " pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.281385 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-config\") pod \"neutron-56db767868-svvqr\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") " pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.283635 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjflb\" (UniqueName: \"kubernetes.io/projected/e7ade8da-71b8-402b-8b7c-d2b333ab31da-kube-api-access-kjflb\") pod \"neutron-56db767868-svvqr\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") " pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.286811 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.298454 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-combined-ca-bundle\") pod \"neutron-56db767868-svvqr\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") " pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.362322 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dvfj\" (UniqueName: \"kubernetes.io/projected/5cc376c5-de8d-4577-913c-ba2ed9d1bc75-kube-api-access-6dvfj\") pod \"auto-csr-approver-29564062-rj85m\" (UID: \"5cc376c5-de8d-4577-913c-ba2ed9d1bc75\") " pod="openshift-infra/auto-csr-approver-29564062-rj85m" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.362822 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.382037 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dvfj\" (UniqueName: \"kubernetes.io/projected/5cc376c5-de8d-4577-913c-ba2ed9d1bc75-kube-api-access-6dvfj\") pod \"auto-csr-approver-29564062-rj85m\" (UID: \"5cc376c5-de8d-4577-913c-ba2ed9d1bc75\") " pod="openshift-infra/auto-csr-approver-29564062-rj85m" Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.422243 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 14:22:00 crc kubenswrapper[4756]: I0318 14:22:00.475849 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564062-rj85m" Mar 18 14:22:01 crc kubenswrapper[4756]: I0318 14:22:01.767278 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-bx844" podUID="c569a795-118f-49c7-850d-798474e0b461" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: i/o timeout" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.223367 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5c574c99c-tvw8d"] Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.225509 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.230336 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.230543 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.254467 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c574c99c-tvw8d"] Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.302419 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-combined-ca-bundle\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.302520 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-httpd-config\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.302542 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-public-tls-certs\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.302596 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-config\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.302623 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-ovndb-tls-certs\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.302752 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-internal-tls-certs\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.302856 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd2wn\" (UniqueName: \"kubernetes.io/projected/5c5360dd-3416-47bb-9e45-b8517121fd45-kube-api-access-kd2wn\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.405190 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-httpd-config\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.406719 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-public-tls-certs\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.406835 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-config\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.406868 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-ovndb-tls-certs\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.406924 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-internal-tls-certs\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.406956 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd2wn\" (UniqueName: \"kubernetes.io/projected/5c5360dd-3416-47bb-9e45-b8517121fd45-kube-api-access-kd2wn\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.407000 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-combined-ca-bundle\") pod 
\"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.412367 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-httpd-config\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.412807 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-internal-tls-certs\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.412879 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-combined-ca-bundle\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.414053 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-ovndb-tls-certs\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.426557 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-config\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: 
I0318 14:22:02.426616 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-public-tls-certs\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.427403 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd2wn\" (UniqueName: \"kubernetes.io/projected/5c5360dd-3416-47bb-9e45-b8517121fd45-kube-api-access-kd2wn\") pod \"neutron-5c574c99c-tvw8d\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.557952 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.868569 4756 scope.go:117] "RemoveContainer" containerID="674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2" Mar 18 14:22:02 crc kubenswrapper[4756]: E0318 14:22:02.869590 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2\": container with ID starting with 674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2 not found: ID does not exist" containerID="674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.869634 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2"} err="failed to get container status \"674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2\": rpc error: code = NotFound desc = could not find container 
\"674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2\": container with ID starting with 674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2 not found: ID does not exist" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.869659 4756 scope.go:117] "RemoveContainer" containerID="1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579" Mar 18 14:22:02 crc kubenswrapper[4756]: E0318 14:22:02.870378 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579\": container with ID starting with 1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579 not found: ID does not exist" containerID="1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.870420 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579"} err="failed to get container status \"1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579\": rpc error: code = NotFound desc = could not find container \"1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579\": container with ID starting with 1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579 not found: ID does not exist" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.870449 4756 scope.go:117] "RemoveContainer" containerID="674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.870748 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2"} err="failed to get container status \"674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2\": rpc error: code = NotFound desc = could not find 
container \"674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2\": container with ID starting with 674a188f85c7a1f58b0c9fb581d239b8cbbaebdf7610dd58579dc67351c98ed2 not found: ID does not exist" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.870770 4756 scope.go:117] "RemoveContainer" containerID="1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.871098 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579"} err="failed to get container status \"1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579\": rpc error: code = NotFound desc = could not find container \"1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579\": container with ID starting with 1461e2ee5accc484aab2b5fa54039cb5809c366d349df0bb193436d8e18f9579 not found: ID does not exist" Mar 18 14:22:02 crc kubenswrapper[4756]: I0318 14:22:02.871211 4756 scope.go:117] "RemoveContainer" containerID="cd49b53d40c0c7d3d96695e99cec5c9170314f6655071dad0505b8a1951aadb4" Mar 18 14:22:02 crc kubenswrapper[4756]: W0318 14:22:02.876109 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod660c0c11_4c3e_465d_a8e8_181dfda9f400.slice/crio-19bc1c1c80f57b5ee27ce248eaf7ca8197002ae2e1830e0fd0a1e305aa00966d WatchSource:0}: Error finding container 19bc1c1c80f57b5ee27ce248eaf7ca8197002ae2e1830e0fd0a1e305aa00966d: Status 404 returned error can't find the container with id 19bc1c1c80f57b5ee27ce248eaf7ca8197002ae2e1830e0fd0a1e305aa00966d Mar 18 14:22:03 crc kubenswrapper[4756]: I0318 14:22:03.055457 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"660c0c11-4c3e-465d-a8e8-181dfda9f400","Type":"ContainerStarted","Data":"19bc1c1c80f57b5ee27ce248eaf7ca8197002ae2e1830e0fd0a1e305aa00966d"} Mar 18 14:22:05 crc kubenswrapper[4756]: I0318 14:22:05.366305 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fqmsw"] Mar 18 14:22:05 crc kubenswrapper[4756]: I0318 14:22:05.520619 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 14:22:05 crc kubenswrapper[4756]: I0318 14:22:05.562680 4756 scope.go:117] "RemoveContainer" containerID="a9ea09330adb62340da11e5e2b97ff53150cd2548bd08eed758cb633aa7e222a" Mar 18 14:22:05 crc kubenswrapper[4756]: E0318 14:22:05.599814 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Mar 18 14:22:05 crc kubenswrapper[4756]: E0318 14:22:05.600144 4756 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Mar 18 14:22:05 crc kubenswrapper[4756]: E0318 14:22:05.600286 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l8tcf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:
false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-tpt8q_openstack(7401eed2-7f0c-4f80-932b-5bc2df6684f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 14:22:05 crc kubenswrapper[4756]: E0318 14:22:05.601570 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-tpt8q" podUID="7401eed2-7f0c-4f80-932b-5bc2df6684f8" Mar 18 14:22:05 crc kubenswrapper[4756]: W0318 14:22:05.601851 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cac161e_ec74_4621_801b_ad39634336d0.slice/crio-4455bb1dd0285e8cdec90d8cf0378a117b0bfcc92cfdc7e575c2bcdb20079821 WatchSource:0}: Error finding container 4455bb1dd0285e8cdec90d8cf0378a117b0bfcc92cfdc7e575c2bcdb20079821: Status 404 returned error can't find the container with id 4455bb1dd0285e8cdec90d8cf0378a117b0bfcc92cfdc7e575c2bcdb20079821 Mar 18 14:22:05 crc kubenswrapper[4756]: W0318 14:22:05.604660 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f752fe8_a008_4737_873f_0ae42990431f.slice/crio-a957f2c89f4b9b89ccb42396fa6020778c1d86289e6443a7d2175d0f1fbe4911 WatchSource:0}: Error finding container a957f2c89f4b9b89ccb42396fa6020778c1d86289e6443a7d2175d0f1fbe4911: Status 404 returned error can't find the container with id a957f2c89f4b9b89ccb42396fa6020778c1d86289e6443a7d2175d0f1fbe4911 Mar 18 14:22:05 crc kubenswrapper[4756]: I0318 14:22:05.651849 4756 scope.go:117] "RemoveContainer" 
containerID="3b6bf8cab0f4a060d8b68d1b475d2350915f50ed8aa12154268eba9fa4a5b39f"
Mar 18 14:22:05 crc kubenswrapper[4756]: I0318 14:22:05.827746 4756 scope.go:117] "RemoveContainer" containerID="f50813530ac27105a5dd2da41073898334f1bc8e57d331b5c78e4bac0f358d14"
Mar 18 14:22:06 crc kubenswrapper[4756]: I0318 14:22:06.058466 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564062-rj85m"]
Mar 18 14:22:06 crc kubenswrapper[4756]: W0318 14:22:06.102826 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cc376c5_de8d_4577_913c_ba2ed9d1bc75.slice/crio-8434f8d7364ac95053ef8e826a9936af8fc453a04998fd4e88b63a727abe4320 WatchSource:0}: Error finding container 8434f8d7364ac95053ef8e826a9936af8fc453a04998fd4e88b63a727abe4320: Status 404 returned error can't find the container with id 8434f8d7364ac95053ef8e826a9936af8fc453a04998fd4e88b63a727abe4320
Mar 18 14:22:06 crc kubenswrapper[4756]: I0318 14:22:06.107287 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a791a6a-0e50-465a-90cf-e4af5bdc12de","Type":"ContainerStarted","Data":"6cd2f7cec824393628fb88948e4665aad058ef37c40b05f1165872d6909f1e3a"}
Mar 18 14:22:06 crc kubenswrapper[4756]: I0318 14:22:06.110618 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fqmsw" event={"ID":"0cac161e-ec74-4621-801b-ad39634336d0","Type":"ContainerStarted","Data":"68fa0a2f1098d2b8cc780fc58f60074582e008246aa9d6f7ccf5b94dfc2b5a57"}
Mar 18 14:22:06 crc kubenswrapper[4756]: I0318 14:22:06.110689 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fqmsw" event={"ID":"0cac161e-ec74-4621-801b-ad39634336d0","Type":"ContainerStarted","Data":"4455bb1dd0285e8cdec90d8cf0378a117b0bfcc92cfdc7e575c2bcdb20079821"}
Mar 18 14:22:06 crc kubenswrapper[4756]: I0318 14:22:06.113648 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ghbmc" event={"ID":"5d52c352-cfd7-4679-912b-11f753c7831f","Type":"ContainerStarted","Data":"7abf039d8bbcd752c853e9aa21a26341344b0b3ac137b315a4f245db38214005"}
Mar 18 14:22:06 crc kubenswrapper[4756]: I0318 14:22:06.122366 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2f752fe8-a008-4737-873f-0ae42990431f","Type":"ContainerStarted","Data":"a957f2c89f4b9b89ccb42396fa6020778c1d86289e6443a7d2175d0f1fbe4911"}
Mar 18 14:22:06 crc kubenswrapper[4756]: I0318 14:22:06.125836 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tnw7j"]
Mar 18 14:22:06 crc kubenswrapper[4756]: I0318 14:22:06.126199 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lvdn6" event={"ID":"18b4eda4-6808-4165-8b7c-e24dc046467c","Type":"ContainerStarted","Data":"6d37e6b93023940efa98e04c99af04d9028ddaa782ed1c41577a4d170a3b9b6b"}
Mar 18 14:22:06 crc kubenswrapper[4756]: I0318 14:22:06.135587 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fqmsw" podStartSLOduration=17.135568081 podStartE2EDuration="17.135568081s" podCreationTimestamp="2026-03-18 14:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:06.127440971 +0000 UTC m=+1327.441858946" watchObservedRunningTime="2026-03-18 14:22:06.135568081 +0000 UTC m=+1327.449986056"
Mar 18 14:22:06 crc kubenswrapper[4756]: E0318 14:22:06.135647 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-tpt8q" podUID="7401eed2-7f0c-4f80-932b-5bc2df6684f8"
Mar 18 14:22:06 crc kubenswrapper[4756]: I0318 14:22:06.153433 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-lvdn6" podStartSLOduration=8.070463096 podStartE2EDuration="36.153413783s" podCreationTimestamp="2026-03-18 14:21:30 +0000 UTC" firstStartedPulling="2026-03-18 14:21:31.615593833 +0000 UTC m=+1292.930011818" lastFinishedPulling="2026-03-18 14:21:59.69854453 +0000 UTC m=+1321.012962505" observedRunningTime="2026-03-18 14:22:06.144499062 +0000 UTC m=+1327.458917057" watchObservedRunningTime="2026-03-18 14:22:06.153413783 +0000 UTC m=+1327.467831758"
Mar 18 14:22:06 crc kubenswrapper[4756]: I0318 14:22:06.163730 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-ghbmc" podStartSLOduration=7.727904505 podStartE2EDuration="36.163714062s" podCreationTimestamp="2026-03-18 14:21:30 +0000 UTC" firstStartedPulling="2026-03-18 14:21:31.262721442 +0000 UTC m=+1292.577139417" lastFinishedPulling="2026-03-18 14:21:59.698530999 +0000 UTC m=+1321.012948974" observedRunningTime="2026-03-18 14:22:06.156663521 +0000 UTC m=+1327.471081516" watchObservedRunningTime="2026-03-18 14:22:06.163714062 +0000 UTC m=+1327.478132037"
Mar 18 14:22:06 crc kubenswrapper[4756]: I0318 14:22:06.297429 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c574c99c-tvw8d"]
Mar 18 14:22:06 crc kubenswrapper[4756]: I0318 14:22:06.916671 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 14:22:06 crc kubenswrapper[4756]: I0318 14:22:06.916903 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 14:22:06 crc kubenswrapper[4756]: I0318 14:22:06.925216 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56db767868-svvqr"]
Mar 18 14:22:07 crc kubenswrapper[4756]: I0318 14:22:07.160786 4756 generic.go:334] "Generic (PLEG): container finished" podID="92a17284-5f6e-4781-b315-374908f04a82" containerID="b76c8e507d83330b6e047742ff88e87f9d6ed89a4c571db8768e44f329878e5c" exitCode=0
Mar 18 14:22:07 crc kubenswrapper[4756]: I0318 14:22:07.160955 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" event={"ID":"92a17284-5f6e-4781-b315-374908f04a82","Type":"ContainerDied","Data":"b76c8e507d83330b6e047742ff88e87f9d6ed89a4c571db8768e44f329878e5c"}
Mar 18 14:22:07 crc kubenswrapper[4756]: I0318 14:22:07.161239 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" event={"ID":"92a17284-5f6e-4781-b315-374908f04a82","Type":"ContainerStarted","Data":"8bb16264848ac922a3719f0e674b0491997b9d16d644a519ae3bc932454148f0"}
Mar 18 14:22:07 crc kubenswrapper[4756]: I0318 14:22:07.168357 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564062-rj85m" event={"ID":"5cc376c5-de8d-4577-913c-ba2ed9d1bc75","Type":"ContainerStarted","Data":"8434f8d7364ac95053ef8e826a9936af8fc453a04998fd4e88b63a727abe4320"}
Mar 18 14:22:07 crc kubenswrapper[4756]: I0318 14:22:07.177071 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56db767868-svvqr" event={"ID":"e7ade8da-71b8-402b-8b7c-d2b333ab31da","Type":"ContainerStarted","Data":"97c4db9b33d5d8ccd826f7cf87ccc52bc21efb116f44c0aa1b1d1d59f7c090cf"}
Mar 18 14:22:07 crc kubenswrapper[4756]: I0318 14:22:07.189370 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c574c99c-tvw8d" event={"ID":"5c5360dd-3416-47bb-9e45-b8517121fd45","Type":"ContainerStarted","Data":"5f31b8b654771aa8604a532288d6818a726c175229175cbaafe4ea4b741b7df0"}
Mar 18 14:22:07 crc kubenswrapper[4756]: I0318 14:22:07.189409 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c574c99c-tvw8d" event={"ID":"5c5360dd-3416-47bb-9e45-b8517121fd45","Type":"ContainerStarted","Data":"9a9e9240c04440aab63f8bb3a1f0c3b56e111402c99293eab5524b0c7fbcd6d1"}
Mar 18 14:22:07 crc kubenswrapper[4756]: I0318 14:22:07.189421 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c574c99c-tvw8d" event={"ID":"5c5360dd-3416-47bb-9e45-b8517121fd45","Type":"ContainerStarted","Data":"420ff5ffda4199d48c2f06213082807802ad4f5b0b7ce9655413b4e28b676b25"}
Mar 18 14:22:07 crc kubenswrapper[4756]: I0318 14:22:07.189545 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5c574c99c-tvw8d"
Mar 18 14:22:07 crc kubenswrapper[4756]: I0318 14:22:07.193180 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"660c0c11-4c3e-465d-a8e8-181dfda9f400","Type":"ContainerStarted","Data":"566c759a70dbd0417353cd55e43283fb0045901dc489f48141cc19b958c7f10a"}
Mar 18 14:22:07 crc kubenswrapper[4756]: I0318 14:22:07.200176 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2f752fe8-a008-4737-873f-0ae42990431f","Type":"ContainerStarted","Data":"0b4362bbe23b3f2e2d554dc797b564555b1676795a8f7418c1957f3bec62f4bb"}
Mar 18 14:22:07 crc kubenswrapper[4756]: I0318 14:22:07.214809 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5c574c99c-tvw8d" podStartSLOduration=5.214789981 podStartE2EDuration="5.214789981s" podCreationTimestamp="2026-03-18 14:22:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:07.209712174 +0000 UTC m=+1328.524130149" watchObservedRunningTime="2026-03-18 14:22:07.214789981 +0000 UTC m=+1328.529207946"
Mar 18 14:22:08 crc kubenswrapper[4756]: I0318 14:22:08.217106 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2f752fe8-a008-4737-873f-0ae42990431f","Type":"ContainerStarted","Data":"7f8c73a200ab1e9fbc6ec57df49d0fa4daf59882731c743ae02b406a84aa05dd"}
Mar 18 14:22:08 crc kubenswrapper[4756]: I0318 14:22:08.225061 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56db767868-svvqr" event={"ID":"e7ade8da-71b8-402b-8b7c-d2b333ab31da","Type":"ContainerStarted","Data":"720b57841e6438cde31ac56c82d9321f65025de2d66a3d41576fa3ab95840aaa"}
Mar 18 14:22:08 crc kubenswrapper[4756]: I0318 14:22:08.227983 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"660c0c11-4c3e-465d-a8e8-181dfda9f400","Type":"ContainerStarted","Data":"b434d48f08622062d240856c8ad009c0feff9187d0d10a7a8919ac644f65115a"}
Mar 18 14:22:08 crc kubenswrapper[4756]: I0318 14:22:08.247677 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=25.24765981 podStartE2EDuration="25.24765981s" podCreationTimestamp="2026-03-18 14:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:08.235529581 +0000 UTC m=+1329.549947576" watchObservedRunningTime="2026-03-18 14:22:08.24765981 +0000 UTC m=+1329.562077785"
Mar 18 14:22:08 crc kubenswrapper[4756]: I0318 14:22:08.261372 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=33.261349679 podStartE2EDuration="33.261349679s" podCreationTimestamp="2026-03-18 14:21:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:08.255370647 +0000 UTC m=+1329.569788622" watchObservedRunningTime="2026-03-18 14:22:08.261349679 +0000 UTC m=+1329.575767654"
Mar 18 14:22:09 crc kubenswrapper[4756]: I0318 14:22:09.238451 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a791a6a-0e50-465a-90cf-e4af5bdc12de","Type":"ContainerStarted","Data":"7a34dbea7ba5ecf140e7fb7f99455f2180277a988725a32294c2b9450d52ab50"}
Mar 18 14:22:09 crc kubenswrapper[4756]: I0318 14:22:09.240170 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" event={"ID":"92a17284-5f6e-4781-b315-374908f04a82","Type":"ContainerStarted","Data":"d3d30add7ba22b0a1eb3606cff1f24caea20033a1779f1a015e402c8a1c064a5"}
Mar 18 14:22:09 crc kubenswrapper[4756]: I0318 14:22:09.240284 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-tnw7j"
Mar 18 14:22:09 crc kubenswrapper[4756]: I0318 14:22:09.242383 4756 generic.go:334] "Generic (PLEG): container finished" podID="5cc376c5-de8d-4577-913c-ba2ed9d1bc75" containerID="f723e226feb9712ad1d42e121fc0cf04b0a08349c3222be90d04b7860a7d6695" exitCode=0
Mar 18 14:22:09 crc kubenswrapper[4756]: I0318 14:22:09.242443 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564062-rj85m" event={"ID":"5cc376c5-de8d-4577-913c-ba2ed9d1bc75","Type":"ContainerDied","Data":"f723e226feb9712ad1d42e121fc0cf04b0a08349c3222be90d04b7860a7d6695"}
Mar 18 14:22:09 crc kubenswrapper[4756]: I0318 14:22:09.244258 4756 generic.go:334] "Generic (PLEG): container finished" podID="0cac161e-ec74-4621-801b-ad39634336d0" containerID="68fa0a2f1098d2b8cc780fc58f60074582e008246aa9d6f7ccf5b94dfc2b5a57" exitCode=0
Mar 18 14:22:09 crc kubenswrapper[4756]: I0318 14:22:09.244331 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fqmsw" event={"ID":"0cac161e-ec74-4621-801b-ad39634336d0","Type":"ContainerDied","Data":"68fa0a2f1098d2b8cc780fc58f60074582e008246aa9d6f7ccf5b94dfc2b5a57"}
Mar 18 14:22:09 crc kubenswrapper[4756]: I0318 14:22:09.245770 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56db767868-svvqr" event={"ID":"e7ade8da-71b8-402b-8b7c-d2b333ab31da","Type":"ContainerStarted","Data":"c81c552cccbd366bde29189352ac386f636c21f02ae547e20dd8d50594724edd"}
Mar 18 14:22:09 crc kubenswrapper[4756]: I0318 14:22:09.246614 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56db767868-svvqr"
Mar 18 14:22:09 crc kubenswrapper[4756]: I0318 14:22:09.247931 4756 generic.go:334] "Generic (PLEG): container finished" podID="18b4eda4-6808-4165-8b7c-e24dc046467c" containerID="6d37e6b93023940efa98e04c99af04d9028ddaa782ed1c41577a4d170a3b9b6b" exitCode=0
Mar 18 14:22:09 crc kubenswrapper[4756]: I0318 14:22:09.247996 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lvdn6" event={"ID":"18b4eda4-6808-4165-8b7c-e24dc046467c","Type":"ContainerDied","Data":"6d37e6b93023940efa98e04c99af04d9028ddaa782ed1c41577a4d170a3b9b6b"}
Mar 18 14:22:09 crc kubenswrapper[4756]: I0318 14:22:09.271087 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" podStartSLOduration=10.271068951 podStartE2EDuration="10.271068951s" podCreationTimestamp="2026-03-18 14:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:09.261809241 +0000 UTC m=+1330.576227216" watchObservedRunningTime="2026-03-18 14:22:09.271068951 +0000 UTC m=+1330.585486926"
Mar 18 14:22:09 crc kubenswrapper[4756]: I0318 14:22:09.287316 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56db767868-svvqr" podStartSLOduration=10.28729985 podStartE2EDuration="10.28729985s" podCreationTimestamp="2026-03-18 14:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:09.282964863 +0000 UTC m=+1330.597382828" watchObservedRunningTime="2026-03-18 14:22:09.28729985 +0000 UTC m=+1330.601717825"
Mar 18 14:22:10 crc kubenswrapper[4756]: I0318 14:22:10.262106 4756 generic.go:334] "Generic (PLEG): container finished" podID="5d52c352-cfd7-4679-912b-11f753c7831f" containerID="7abf039d8bbcd752c853e9aa21a26341344b0b3ac137b315a4f245db38214005" exitCode=0
Mar 18 14:22:10 crc kubenswrapper[4756]: I0318 14:22:10.262149 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ghbmc" event={"ID":"5d52c352-cfd7-4679-912b-11f753c7831f","Type":"ContainerDied","Data":"7abf039d8bbcd752c853e9aa21a26341344b0b3ac137b315a4f245db38214005"}
Mar 18 14:22:13 crc kubenswrapper[4756]: I0318 14:22:13.574823 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 18 14:22:13 crc kubenswrapper[4756]: I0318 14:22:13.575371 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 18 14:22:13 crc kubenswrapper[4756]: I0318 14:22:13.575383 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 18 14:22:13 crc kubenswrapper[4756]: I0318 14:22:13.575393 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 18 14:22:13 crc kubenswrapper[4756]: I0318 14:22:13.622580 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 18 14:22:13 crc kubenswrapper[4756]: I0318 14:22:13.631002 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.290938 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564062-rj85m"
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.298600 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ghbmc"
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.316895 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fqmsw"
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.332919 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-lvdn6"
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.363532 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lvdn6" event={"ID":"18b4eda4-6808-4165-8b7c-e24dc046467c","Type":"ContainerDied","Data":"36ebabe9a17cc18caafe7a68d265478cc861065857e7d788137afa03c454132d"}
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.363760 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36ebabe9a17cc18caafe7a68d265478cc861065857e7d788137afa03c454132d"
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.363878 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-lvdn6"
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.366542 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564062-rj85m" event={"ID":"5cc376c5-de8d-4577-913c-ba2ed9d1bc75","Type":"ContainerDied","Data":"8434f8d7364ac95053ef8e826a9936af8fc453a04998fd4e88b63a727abe4320"}
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.366654 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8434f8d7364ac95053ef8e826a9936af8fc453a04998fd4e88b63a727abe4320"
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.366781 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564062-rj85m"
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.372495 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fqmsw"
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.372476 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fqmsw" event={"ID":"0cac161e-ec74-4621-801b-ad39634336d0","Type":"ContainerDied","Data":"4455bb1dd0285e8cdec90d8cf0378a117b0bfcc92cfdc7e575c2bcdb20079821"}
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.372729 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4455bb1dd0285e8cdec90d8cf0378a117b0bfcc92cfdc7e575c2bcdb20079821"
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.376459 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ghbmc" event={"ID":"5d52c352-cfd7-4679-912b-11f753c7831f","Type":"ContainerDied","Data":"eafbcd01aab50e5cb9c6faafbdea009f25c3a6455cec345adf6e1ceb52b7fffb"}
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.376552 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eafbcd01aab50e5cb9c6faafbdea009f25c3a6455cec345adf6e1ceb52b7fffb"
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.376504 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ghbmc"
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.460445 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dvfj\" (UniqueName: \"kubernetes.io/projected/5cc376c5-de8d-4577-913c-ba2ed9d1bc75-kube-api-access-6dvfj\") pod \"5cc376c5-de8d-4577-913c-ba2ed9d1bc75\" (UID: \"5cc376c5-de8d-4577-913c-ba2ed9d1bc75\") "
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.460493 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-config-data\") pod \"18b4eda4-6808-4165-8b7c-e24dc046467c\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") "
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.460547 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-credential-keys\") pod \"0cac161e-ec74-4621-801b-ad39634336d0\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") "
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.460565 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-combined-ca-bundle\") pod \"18b4eda4-6808-4165-8b7c-e24dc046467c\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") "
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.460649 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5d52c352-cfd7-4679-912b-11f753c7831f-db-sync-config-data\") pod \"5d52c352-cfd7-4679-912b-11f753c7831f\" (UID: \"5d52c352-cfd7-4679-912b-11f753c7831f\") "
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.460672 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-scripts\") pod \"18b4eda4-6808-4165-8b7c-e24dc046467c\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") "
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.460704 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjtth\" (UniqueName: \"kubernetes.io/projected/18b4eda4-6808-4165-8b7c-e24dc046467c-kube-api-access-jjtth\") pod \"18b4eda4-6808-4165-8b7c-e24dc046467c\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") "
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.460745 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-config-data\") pod \"0cac161e-ec74-4621-801b-ad39634336d0\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") "
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.460781 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d52c352-cfd7-4679-912b-11f753c7831f-combined-ca-bundle\") pod \"5d52c352-cfd7-4679-912b-11f753c7831f\" (UID: \"5d52c352-cfd7-4679-912b-11f753c7831f\") "
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.460812 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2n59\" (UniqueName: \"kubernetes.io/projected/0cac161e-ec74-4621-801b-ad39634336d0-kube-api-access-s2n59\") pod \"0cac161e-ec74-4621-801b-ad39634336d0\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") "
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.460840 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-fernet-keys\") pod \"0cac161e-ec74-4621-801b-ad39634336d0\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") "
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.460894 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18b4eda4-6808-4165-8b7c-e24dc046467c-logs\") pod \"18b4eda4-6808-4165-8b7c-e24dc046467c\" (UID: \"18b4eda4-6808-4165-8b7c-e24dc046467c\") "
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.460933 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bgxj\" (UniqueName: \"kubernetes.io/projected/5d52c352-cfd7-4679-912b-11f753c7831f-kube-api-access-8bgxj\") pod \"5d52c352-cfd7-4679-912b-11f753c7831f\" (UID: \"5d52c352-cfd7-4679-912b-11f753c7831f\") "
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.460949 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-scripts\") pod \"0cac161e-ec74-4621-801b-ad39634336d0\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") "
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.460975 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-combined-ca-bundle\") pod \"0cac161e-ec74-4621-801b-ad39634336d0\" (UID: \"0cac161e-ec74-4621-801b-ad39634336d0\") "
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.462314 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18b4eda4-6808-4165-8b7c-e24dc046467c-logs" (OuterVolumeSpecName: "logs") pod "18b4eda4-6808-4165-8b7c-e24dc046467c" (UID: "18b4eda4-6808-4165-8b7c-e24dc046467c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.466317 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cc376c5-de8d-4577-913c-ba2ed9d1bc75-kube-api-access-6dvfj" (OuterVolumeSpecName: "kube-api-access-6dvfj") pod "5cc376c5-de8d-4577-913c-ba2ed9d1bc75" (UID: "5cc376c5-de8d-4577-913c-ba2ed9d1bc75"). InnerVolumeSpecName "kube-api-access-6dvfj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.467404 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0cac161e-ec74-4621-801b-ad39634336d0" (UID: "0cac161e-ec74-4621-801b-ad39634336d0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.468238 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0cac161e-ec74-4621-801b-ad39634336d0" (UID: "0cac161e-ec74-4621-801b-ad39634336d0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.469354 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d52c352-cfd7-4679-912b-11f753c7831f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5d52c352-cfd7-4679-912b-11f753c7831f" (UID: "5d52c352-cfd7-4679-912b-11f753c7831f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.469377 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d52c352-cfd7-4679-912b-11f753c7831f-kube-api-access-8bgxj" (OuterVolumeSpecName: "kube-api-access-8bgxj") pod "5d52c352-cfd7-4679-912b-11f753c7831f" (UID: "5d52c352-cfd7-4679-912b-11f753c7831f"). InnerVolumeSpecName "kube-api-access-8bgxj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.470280 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-scripts" (OuterVolumeSpecName: "scripts") pod "18b4eda4-6808-4165-8b7c-e24dc046467c" (UID: "18b4eda4-6808-4165-8b7c-e24dc046467c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.473039 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b4eda4-6808-4165-8b7c-e24dc046467c-kube-api-access-jjtth" (OuterVolumeSpecName: "kube-api-access-jjtth") pod "18b4eda4-6808-4165-8b7c-e24dc046467c" (UID: "18b4eda4-6808-4165-8b7c-e24dc046467c"). InnerVolumeSpecName "kube-api-access-jjtth". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.481318 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cac161e-ec74-4621-801b-ad39634336d0-kube-api-access-s2n59" (OuterVolumeSpecName: "kube-api-access-s2n59") pod "0cac161e-ec74-4621-801b-ad39634336d0" (UID: "0cac161e-ec74-4621-801b-ad39634336d0"). InnerVolumeSpecName "kube-api-access-s2n59". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.482952 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-scripts" (OuterVolumeSpecName: "scripts") pod "0cac161e-ec74-4621-801b-ad39634336d0" (UID: "0cac161e-ec74-4621-801b-ad39634336d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.498306 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d52c352-cfd7-4679-912b-11f753c7831f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d52c352-cfd7-4679-912b-11f753c7831f" (UID: "5d52c352-cfd7-4679-912b-11f753c7831f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.507541 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cac161e-ec74-4621-801b-ad39634336d0" (UID: "0cac161e-ec74-4621-801b-ad39634336d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.513656 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-config-data" (OuterVolumeSpecName: "config-data") pod "18b4eda4-6808-4165-8b7c-e24dc046467c" (UID: "18b4eda4-6808-4165-8b7c-e24dc046467c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.520730 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-config-data" (OuterVolumeSpecName: "config-data") pod "0cac161e-ec74-4621-801b-ad39634336d0" (UID: "0cac161e-ec74-4621-801b-ad39634336d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.539382 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18b4eda4-6808-4165-8b7c-e24dc046467c" (UID: "18b4eda4-6808-4165-8b7c-e24dc046467c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.563052 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.563081 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bgxj\" (UniqueName: \"kubernetes.io/projected/5d52c352-cfd7-4679-912b-11f753c7831f-kube-api-access-8bgxj\") on node \"crc\" DevicePath \"\""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.563092 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.563101 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dvfj\" (UniqueName: \"kubernetes.io/projected/5cc376c5-de8d-4577-913c-ba2ed9d1bc75-kube-api-access-6dvfj\") on node \"crc\" DevicePath \"\""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.563109 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.563137 4756 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.563147 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.563161 4756 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5d52c352-cfd7-4679-912b-11f753c7831f-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.563170 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18b4eda4-6808-4165-8b7c-e24dc046467c-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.563180 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjtth\" (UniqueName: \"kubernetes.io/projected/18b4eda4-6808-4165-8b7c-e24dc046467c-kube-api-access-jjtth\") on node \"crc\" DevicePath \"\""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.563188 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.563196 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d52c352-cfd7-4679-912b-11f753c7831f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.563205 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2n59\" (UniqueName: \"kubernetes.io/projected/0cac161e-ec74-4621-801b-ad39634336d0-kube-api-access-s2n59\") on node \"crc\" DevicePath \"\""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.563212 4756 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0cac161e-ec74-4621-801b-ad39634336d0-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 18 14:22:14 crc kubenswrapper[4756]: I0318 14:22:14.563220 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18b4eda4-6808-4165-8b7c-e24dc046467c-logs\") on node \"crc\" DevicePath \"\""
Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.289445 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-tnw7j"
Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.398032 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a791a6a-0e50-465a-90cf-e4af5bdc12de","Type":"ContainerStarted","Data":"544736d183a3cbf62f7ad025b7e67c4b647a9fe7b073a0f3ad059d28034fcae6"}
Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.401963 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rpx8m" event={"ID":"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e","Type":"ContainerStarted","Data":"d2d1fe77d93c754e6515368dd66601e0dd41859e0f19118ebdbf4cde6a187c19"}
Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.428166 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-t4jg8"]
Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.428400 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" podUID="2279186d-a6bf-4a99-a62f-6f1a0a405269" containerName="dnsmasq-dns" containerID="cri-o://8afca66c131ab3c8ef6705d5594492be692fd18cebc663ccbb37cae1ccb2b7ba" gracePeriod=10
Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.491712 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564056-4x5qk"]
Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.503614 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-rpx8m" podStartSLOduration=3.55743522 podStartE2EDuration="46.503596613s" podCreationTimestamp="2026-03-18 14:21:29 +0000 UTC" firstStartedPulling="2026-03-18 14:21:31.15578097 +0000 UTC m=+1292.470198935" lastFinishedPulling="2026-03-18 14:22:14.101942353 +0000 UTC m=+1335.416360328" observedRunningTime="2026-03-18 14:22:15.428810331 +0000 UTC m=+1336.743228296" watchObservedRunningTime="2026-03-18 14:22:15.503596613 +0000 UTC m=+1336.818014588"
Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.505952 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564056-4x5qk"]
Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.617267 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7f64d44848-xg692"]
Mar 18 14:22:15 crc kubenswrapper[4756]: E0318 14:22:15.618131 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cac161e-ec74-4621-801b-ad39634336d0" containerName="keystone-bootstrap"
Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.618148 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cac161e-ec74-4621-801b-ad39634336d0" containerName="keystone-bootstrap"
Mar 18 14:22:15 crc kubenswrapper[4756]: E0318 14:22:15.618184 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d52c352-cfd7-4679-912b-11f753c7831f" containerName="barbican-db-sync"
Mar 18
14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.618191 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d52c352-cfd7-4679-912b-11f753c7831f" containerName="barbican-db-sync" Mar 18 14:22:15 crc kubenswrapper[4756]: E0318 14:22:15.618226 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b4eda4-6808-4165-8b7c-e24dc046467c" containerName="placement-db-sync" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.618233 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b4eda4-6808-4165-8b7c-e24dc046467c" containerName="placement-db-sync" Mar 18 14:22:15 crc kubenswrapper[4756]: E0318 14:22:15.618252 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cc376c5-de8d-4577-913c-ba2ed9d1bc75" containerName="oc" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.618258 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc376c5-de8d-4577-913c-ba2ed9d1bc75" containerName="oc" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.618606 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cc376c5-de8d-4577-913c-ba2ed9d1bc75" containerName="oc" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.618645 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cac161e-ec74-4621-801b-ad39634336d0" containerName="keystone-bootstrap" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.618662 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b4eda4-6808-4165-8b7c-e24dc046467c" containerName="placement-db-sync" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.618671 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d52c352-cfd7-4679-912b-11f753c7831f" containerName="barbican-db-sync" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.619595 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.626233 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-combined-ca-bundle\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.626282 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-internal-tls-certs\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.626382 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-credential-keys\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.626471 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-scripts\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.626548 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svh9v\" (UniqueName: \"kubernetes.io/projected/e0973f28-44a0-4f66-aadf-42187c9ced68-kube-api-access-svh9v\") pod \"keystone-7f64d44848-xg692\" (UID: 
\"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.626591 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-fernet-keys\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.626630 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-config-data\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.626672 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-public-tls-certs\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.655738 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.655776 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.656095 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jstgl" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.656309 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 18 14:22:15 crc kubenswrapper[4756]: 
I0318 14:22:15.656567 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.656729 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.700599 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f64d44848-xg692"] Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.726457 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5f97847bb-hrplw"] Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.728060 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.729616 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-public-tls-certs\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.729708 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-combined-ca-bundle\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.729743 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-internal-tls-certs\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.729817 
4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-credential-keys\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.729895 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-scripts\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.729944 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svh9v\" (UniqueName: \"kubernetes.io/projected/e0973f28-44a0-4f66-aadf-42187c9ced68-kube-api-access-svh9v\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.729980 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-fernet-keys\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.730011 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-config-data\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.737913 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 
14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.738570 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.739687 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.740838 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.741833 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-cm9sh" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.749897 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-scripts\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.753926 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-credential-keys\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.755814 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-config-data\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.760872 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5f97847bb-hrplw"] Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 
14:22:15.769524 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-combined-ca-bundle\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.769929 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-internal-tls-certs\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.797766 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" podUID="2279186d-a6bf-4a99-a62f-6f1a0a405269" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.169:5353: connect: connection refused" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.800107 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-fernet-keys\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.800654 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0973f28-44a0-4f66-aadf-42187c9ced68-public-tls-certs\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.819809 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svh9v\" (UniqueName: 
\"kubernetes.io/projected/e0973f28-44a0-4f66-aadf-42187c9ced68-kube-api-access-svh9v\") pod \"keystone-7f64d44848-xg692\" (UID: \"e0973f28-44a0-4f66-aadf-42187c9ced68\") " pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.825196 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-788dd8d778-6jnls"] Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.827297 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.831254 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13d06afc-d8e7-40dd-b7ea-561655336f97-logs\") pod \"barbican-keystone-listener-788dd8d778-6jnls\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.831300 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d50193a-fddc-4dc4-a597-96d67e28a55b-logs\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.831318 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-config-data\") pod \"barbican-keystone-listener-788dd8d778-6jnls\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.831336 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-internal-tls-certs\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.831378 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-scripts\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.831408 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-public-tls-certs\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.831430 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-659zn\" (UniqueName: \"kubernetes.io/projected/8d50193a-fddc-4dc4-a597-96d67e28a55b-kube-api-access-659zn\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.831467 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6q6p\" (UniqueName: \"kubernetes.io/projected/13d06afc-d8e7-40dd-b7ea-561655336f97-kube-api-access-t6q6p\") pod \"barbican-keystone-listener-788dd8d778-6jnls\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.831501 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-config-data-custom\") pod \"barbican-keystone-listener-788dd8d778-6jnls\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.831545 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-config-data\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.831591 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-combined-ca-bundle\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.831639 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-combined-ca-bundle\") pod \"barbican-keystone-listener-788dd8d778-6jnls\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.851554 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.851736 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xdszx" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 
14:22:15.851967 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.852486 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-65fcb6c8bc-k2qdb"] Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.854055 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.859550 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-788dd8d778-6jnls"] Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.863432 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.932590 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d1b57e3-2938-4c59-a081-487660fa5e9f-logs\") pod \"barbican-worker-65fcb6c8bc-k2qdb\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.932742 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-combined-ca-bundle\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.932826 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-combined-ca-bundle\") pod \"barbican-keystone-listener-788dd8d778-6jnls\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " 
pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.932937 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13d06afc-d8e7-40dd-b7ea-561655336f97-logs\") pod \"barbican-keystone-listener-788dd8d778-6jnls\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.934831 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d50193a-fddc-4dc4-a597-96d67e28a55b-logs\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.934942 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-config-data\") pod \"barbican-keystone-listener-788dd8d778-6jnls\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.935041 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-internal-tls-certs\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.935181 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-scripts\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 
14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.935301 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-public-tls-certs\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.935392 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-659zn\" (UniqueName: \"kubernetes.io/projected/8d50193a-fddc-4dc4-a597-96d67e28a55b-kube-api-access-659zn\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.935485 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b26sv\" (UniqueName: \"kubernetes.io/projected/5d1b57e3-2938-4c59-a081-487660fa5e9f-kube-api-access-b26sv\") pod \"barbican-worker-65fcb6c8bc-k2qdb\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.938364 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-config-data-custom\") pod \"barbican-worker-65fcb6c8bc-k2qdb\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.938456 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6q6p\" (UniqueName: \"kubernetes.io/projected/13d06afc-d8e7-40dd-b7ea-561655336f97-kube-api-access-t6q6p\") pod \"barbican-keystone-listener-788dd8d778-6jnls\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " 
pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.938546 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-config-data-custom\") pod \"barbican-keystone-listener-788dd8d778-6jnls\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.938621 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-combined-ca-bundle\") pod \"barbican-worker-65fcb6c8bc-k2qdb\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.938698 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-config-data\") pod \"barbican-worker-65fcb6c8bc-k2qdb\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.938812 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-config-data\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.944201 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13d06afc-d8e7-40dd-b7ea-561655336f97-logs\") pod \"barbican-keystone-listener-788dd8d778-6jnls\" (UID: 
\"13d06afc-d8e7-40dd-b7ea-561655336f97\") " pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.944950 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d50193a-fddc-4dc4-a597-96d67e28a55b-logs\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.965370 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65fcb6c8bc-k2qdb"] Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.966533 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-scripts\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.968020 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-combined-ca-bundle\") pod \"barbican-keystone-listener-788dd8d778-6jnls\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.969867 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-659zn\" (UniqueName: \"kubernetes.io/projected/8d50193a-fddc-4dc4-a597-96d67e28a55b-kube-api-access-659zn\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.982281 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-internal-tls-certs\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.984904 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6q6p\" (UniqueName: \"kubernetes.io/projected/13d06afc-d8e7-40dd-b7ea-561655336f97-kube-api-access-t6q6p\") pod \"barbican-keystone-listener-788dd8d778-6jnls\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.985049 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-config-data\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.985409 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-combined-ca-bundle\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.987633 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-config-data\") pod \"barbican-keystone-listener-788dd8d778-6jnls\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.988769 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-public-tls-certs\") pod \"placement-5f97847bb-hrplw\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:15 crc kubenswrapper[4756]: I0318 14:22:15.995617 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-config-data-custom\") pod \"barbican-keystone-listener-788dd8d778-6jnls\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.041057 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.044270 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b26sv\" (UniqueName: \"kubernetes.io/projected/5d1b57e3-2938-4c59-a081-487660fa5e9f-kube-api-access-b26sv\") pod \"barbican-worker-65fcb6c8bc-k2qdb\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.044316 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-config-data-custom\") pod \"barbican-worker-65fcb6c8bc-k2qdb\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.044508 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-combined-ca-bundle\") pod \"barbican-worker-65fcb6c8bc-k2qdb\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " 
pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.044555 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-config-data\") pod \"barbican-worker-65fcb6c8bc-k2qdb\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.044649 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d1b57e3-2938-4c59-a081-487660fa5e9f-logs\") pod \"barbican-worker-65fcb6c8bc-k2qdb\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.045206 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d1b57e3-2938-4c59-a081-487660fa5e9f-logs\") pod \"barbican-worker-65fcb6c8bc-k2qdb\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.054240 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-config-data-custom\") pod \"barbican-worker-65fcb6c8bc-k2qdb\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.065567 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9sh7"] Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.068630 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.069423 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-config-data\") pod \"barbican-worker-65fcb6c8bc-k2qdb\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.069594 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.072358 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.075482 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-combined-ca-bundle\") pod \"barbican-worker-65fcb6c8bc-k2qdb\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.085454 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b26sv\" (UniqueName: \"kubernetes.io/projected/5d1b57e3-2938-4c59-a081-487660fa5e9f-kube-api-access-b26sv\") pod \"barbican-worker-65fcb6c8bc-k2qdb\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.095188 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9sh7"] Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.191279 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7995d6cd86-6kx6b"] Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.192887 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.192959 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.196035 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7995d6cd86-6kx6b"] Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.221389 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.222051 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6ccf458dc-bmbzj"] Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.223897 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.235510 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6b67498b8b-xjbnx"] Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.237043 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.237147 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.247163 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.249749 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.249792 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkr8m\" (UniqueName: \"kubernetes.io/projected/a6759509-7c13-49a6-893a-86605058eabc-kube-api-access-vkr8m\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.249888 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-config\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.249920 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.249939 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.250268 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6ccf458dc-bmbzj"] Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.253659 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-dns-svc\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.257657 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.258559 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b67498b8b-xjbnx"] Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.279269 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.359520 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5-combined-ca-bundle\") pod \"barbican-worker-6ccf458dc-bmbzj\" (UID: \"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5\") " pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.359579 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5-config-data\") pod \"barbican-worker-6ccf458dc-bmbzj\" (UID: \"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5\") " pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.359675 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.359696 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkr8m\" (UniqueName: \"kubernetes.io/projected/a6759509-7c13-49a6-893a-86605058eabc-kube-api-access-vkr8m\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.359713 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7q9r\" (UniqueName: \"kubernetes.io/projected/7ad060be-35eb-4d9d-8a45-0a387009708c-kube-api-access-q7q9r\") pod 
\"barbican-keystone-listener-7995d6cd86-6kx6b\" (UID: \"7ad060be-35eb-4d9d-8a45-0a387009708c\") " pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.359850 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ad060be-35eb-4d9d-8a45-0a387009708c-config-data\") pod \"barbican-keystone-listener-7995d6cd86-6kx6b\" (UID: \"7ad060be-35eb-4d9d-8a45-0a387009708c\") " pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.359875 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-config\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.359894 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63030e19-cf53-4766-aeb1-e25be96a8652-logs\") pod \"barbican-api-6b67498b8b-xjbnx\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.359909 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ad060be-35eb-4d9d-8a45-0a387009708c-logs\") pod \"barbican-keystone-listener-7995d6cd86-6kx6b\" (UID: \"7ad060be-35eb-4d9d-8a45-0a387009708c\") " pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.359958 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.363528 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.363609 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7sd8\" (UniqueName: \"kubernetes.io/projected/63030e19-cf53-4766-aeb1-e25be96a8652-kube-api-access-d7sd8\") pod \"barbican-api-6b67498b8b-xjbnx\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.363644 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-config-data-custom\") pod \"barbican-api-6b67498b8b-xjbnx\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.363680 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5-logs\") pod \"barbican-worker-6ccf458dc-bmbzj\" (UID: \"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5\") " pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.363736 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-combined-ca-bundle\") pod \"barbican-api-6b67498b8b-xjbnx\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.363764 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5-config-data-custom\") pod \"barbican-worker-6ccf458dc-bmbzj\" (UID: \"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5\") " pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.363795 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-config-data\") pod \"barbican-api-6b67498b8b-xjbnx\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.363858 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ad060be-35eb-4d9d-8a45-0a387009708c-combined-ca-bundle\") pod \"barbican-keystone-listener-7995d6cd86-6kx6b\" (UID: \"7ad060be-35eb-4d9d-8a45-0a387009708c\") " pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.363893 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fprzk\" (UniqueName: \"kubernetes.io/projected/8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5-kube-api-access-fprzk\") pod \"barbican-worker-6ccf458dc-bmbzj\" (UID: \"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5\") " pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 
14:22:16.363918 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ad060be-35eb-4d9d-8a45-0a387009708c-config-data-custom\") pod \"barbican-keystone-listener-7995d6cd86-6kx6b\" (UID: \"7ad060be-35eb-4d9d-8a45-0a387009708c\") " pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.363940 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-dns-svc\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.364755 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-dns-svc\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.365334 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.366052 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-config\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.371973 4756 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.394786 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkr8m\" (UniqueName: \"kubernetes.io/projected/a6759509-7c13-49a6-893a-86605058eabc-kube-api-access-vkr8m\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.407300 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-m9sh7\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.416798 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.465492 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ad060be-35eb-4d9d-8a45-0a387009708c-config-data\") pod \"barbican-keystone-listener-7995d6cd86-6kx6b\" (UID: \"7ad060be-35eb-4d9d-8a45-0a387009708c\") " pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.465626 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63030e19-cf53-4766-aeb1-e25be96a8652-logs\") pod \"barbican-api-6b67498b8b-xjbnx\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.465708 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ad060be-35eb-4d9d-8a45-0a387009708c-logs\") pod \"barbican-keystone-listener-7995d6cd86-6kx6b\" (UID: \"7ad060be-35eb-4d9d-8a45-0a387009708c\") " pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.465802 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7sd8\" (UniqueName: \"kubernetes.io/projected/63030e19-cf53-4766-aeb1-e25be96a8652-kube-api-access-d7sd8\") pod \"barbican-api-6b67498b8b-xjbnx\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.465884 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-config-data-custom\") pod \"barbican-api-6b67498b8b-xjbnx\" (UID: 
\"63030e19-cf53-4766-aeb1-e25be96a8652\") " pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.465968 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5-logs\") pod \"barbican-worker-6ccf458dc-bmbzj\" (UID: \"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5\") " pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.466052 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-combined-ca-bundle\") pod \"barbican-api-6b67498b8b-xjbnx\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.466165 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5-config-data-custom\") pod \"barbican-worker-6ccf458dc-bmbzj\" (UID: \"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5\") " pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.466250 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-config-data\") pod \"barbican-api-6b67498b8b-xjbnx\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.466762 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ad060be-35eb-4d9d-8a45-0a387009708c-combined-ca-bundle\") pod \"barbican-keystone-listener-7995d6cd86-6kx6b\" (UID: 
\"7ad060be-35eb-4d9d-8a45-0a387009708c\") " pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.466921 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fprzk\" (UniqueName: \"kubernetes.io/projected/8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5-kube-api-access-fprzk\") pod \"barbican-worker-6ccf458dc-bmbzj\" (UID: \"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5\") " pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.467000 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ad060be-35eb-4d9d-8a45-0a387009708c-config-data-custom\") pod \"barbican-keystone-listener-7995d6cd86-6kx6b\" (UID: \"7ad060be-35eb-4d9d-8a45-0a387009708c\") " pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.467097 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5-combined-ca-bundle\") pod \"barbican-worker-6ccf458dc-bmbzj\" (UID: \"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5\") " pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.467200 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5-config-data\") pod \"barbican-worker-6ccf458dc-bmbzj\" (UID: \"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5\") " pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.467336 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7q9r\" (UniqueName: \"kubernetes.io/projected/7ad060be-35eb-4d9d-8a45-0a387009708c-kube-api-access-q7q9r\") pod 
\"barbican-keystone-listener-7995d6cd86-6kx6b\" (UID: \"7ad060be-35eb-4d9d-8a45-0a387009708c\") " pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.466964 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5-logs\") pod \"barbican-worker-6ccf458dc-bmbzj\" (UID: \"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5\") " pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.482057 4756 generic.go:334] "Generic (PLEG): container finished" podID="2279186d-a6bf-4a99-a62f-6f1a0a405269" containerID="8afca66c131ab3c8ef6705d5594492be692fd18cebc663ccbb37cae1ccb2b7ba" exitCode=0 Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.466603 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63030e19-cf53-4766-aeb1-e25be96a8652-logs\") pod \"barbican-api-6b67498b8b-xjbnx\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.466342 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ad060be-35eb-4d9d-8a45-0a387009708c-logs\") pod \"barbican-keystone-listener-7995d6cd86-6kx6b\" (UID: \"7ad060be-35eb-4d9d-8a45-0a387009708c\") " pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.482621 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.483595 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" event={"ID":"2279186d-a6bf-4a99-a62f-6f1a0a405269","Type":"ContainerDied","Data":"8afca66c131ab3c8ef6705d5594492be692fd18cebc663ccbb37cae1ccb2b7ba"} Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.483639 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.483651 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-t4jg8" event={"ID":"2279186d-a6bf-4a99-a62f-6f1a0a405269","Type":"ContainerDied","Data":"3f195469aa41e3789712c6e6ec5f0169af2c233fba7e27ed1ea819b58516a5a8"} Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.483663 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.483677 4756 scope.go:117] "RemoveContainer" containerID="8afca66c131ab3c8ef6705d5594492be692fd18cebc663ccbb37cae1ccb2b7ba" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.532637 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ad060be-35eb-4d9d-8a45-0a387009708c-combined-ca-bundle\") pod \"barbican-keystone-listener-7995d6cd86-6kx6b\" (UID: \"7ad060be-35eb-4d9d-8a45-0a387009708c\") " pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.542443 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7sd8\" (UniqueName: \"kubernetes.io/projected/63030e19-cf53-4766-aeb1-e25be96a8652-kube-api-access-d7sd8\") pod \"barbican-api-6b67498b8b-xjbnx\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " 
pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.542985 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5-combined-ca-bundle\") pod \"barbican-worker-6ccf458dc-bmbzj\" (UID: \"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5\") " pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.544474 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-config-data\") pod \"barbican-api-6b67498b8b-xjbnx\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.544530 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5-config-data\") pod \"barbican-worker-6ccf458dc-bmbzj\" (UID: \"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5\") " pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.544956 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-combined-ca-bundle\") pod \"barbican-api-6b67498b8b-xjbnx\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.545490 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ad060be-35eb-4d9d-8a45-0a387009708c-config-data-custom\") pod \"barbican-keystone-listener-7995d6cd86-6kx6b\" (UID: \"7ad060be-35eb-4d9d-8a45-0a387009708c\") " pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 
18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.555377 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5-config-data-custom\") pod \"barbican-worker-6ccf458dc-bmbzj\" (UID: \"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5\") " pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.556285 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7q9r\" (UniqueName: \"kubernetes.io/projected/7ad060be-35eb-4d9d-8a45-0a387009708c-kube-api-access-q7q9r\") pod \"barbican-keystone-listener-7995d6cd86-6kx6b\" (UID: \"7ad060be-35eb-4d9d-8a45-0a387009708c\") " pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.556525 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-config-data-custom\") pod \"barbican-api-6b67498b8b-xjbnx\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.558693 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fprzk\" (UniqueName: \"kubernetes.io/projected/8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5-kube-api-access-fprzk\") pod \"barbican-worker-6ccf458dc-bmbzj\" (UID: \"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5\") " pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.558929 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ad060be-35eb-4d9d-8a45-0a387009708c-config-data\") pod \"barbican-keystone-listener-7995d6cd86-6kx6b\" (UID: \"7ad060be-35eb-4d9d-8a45-0a387009708c\") " 
pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.568891 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-ovsdbserver-nb\") pod \"2279186d-a6bf-4a99-a62f-6f1a0a405269\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.568962 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-config\") pod \"2279186d-a6bf-4a99-a62f-6f1a0a405269\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.568994 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89rqv\" (UniqueName: \"kubernetes.io/projected/2279186d-a6bf-4a99-a62f-6f1a0a405269-kube-api-access-89rqv\") pod \"2279186d-a6bf-4a99-a62f-6f1a0a405269\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.569015 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-ovsdbserver-sb\") pod \"2279186d-a6bf-4a99-a62f-6f1a0a405269\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.569069 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-dns-svc\") pod \"2279186d-a6bf-4a99-a62f-6f1a0a405269\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.569087 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-dns-swift-storage-0\") pod \"2279186d-a6bf-4a99-a62f-6f1a0a405269\" (UID: \"2279186d-a6bf-4a99-a62f-6f1a0a405269\") " Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.593802 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.594712 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.603439 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2279186d-a6bf-4a99-a62f-6f1a0a405269-kube-api-access-89rqv" (OuterVolumeSpecName: "kube-api-access-89rqv") pod "2279186d-a6bf-4a99-a62f-6f1a0a405269" (UID: "2279186d-a6bf-4a99-a62f-6f1a0a405269"). InnerVolumeSpecName "kube-api-access-89rqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.673634 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6ccf458dc-bmbzj" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.675903 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89rqv\" (UniqueName: \"kubernetes.io/projected/2279186d-a6bf-4a99-a62f-6f1a0a405269-kube-api-access-89rqv\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.683772 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2279186d-a6bf-4a99-a62f-6f1a0a405269" (UID: "2279186d-a6bf-4a99-a62f-6f1a0a405269"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.685189 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-config" (OuterVolumeSpecName: "config") pod "2279186d-a6bf-4a99-a62f-6f1a0a405269" (UID: "2279186d-a6bf-4a99-a62f-6f1a0a405269"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.694659 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.700542 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2279186d-a6bf-4a99-a62f-6f1a0a405269" (UID: "2279186d-a6bf-4a99-a62f-6f1a0a405269"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.702891 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2279186d-a6bf-4a99-a62f-6f1a0a405269" (UID: "2279186d-a6bf-4a99-a62f-6f1a0a405269"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.742607 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2279186d-a6bf-4a99-a62f-6f1a0a405269" (UID: "2279186d-a6bf-4a99-a62f-6f1a0a405269"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.779287 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.779322 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.779331 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.779339 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.779348 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2279186d-a6bf-4a99-a62f-6f1a0a405269-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.898091 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-t4jg8"] Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.911170 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-t4jg8"] Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.922413 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f64d44848-xg692"] Mar 18 14:22:16 crc kubenswrapper[4756]: I0318 14:22:16.986791 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-5f97847bb-hrplw"] Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 14:22:17.165445 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65fcb6c8bc-k2qdb"] Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 14:22:17.174732 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-788dd8d778-6jnls"] Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 14:22:17.229821 4756 scope.go:117] "RemoveContainer" containerID="bf9f08b75e9442f6ba522deacdcd2f7bb836a85753a6f877b721088042a46c83" Mar 18 14:22:17 crc kubenswrapper[4756]: W0318 14:22:17.265085 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d1b57e3_2938_4c59_a081_487660fa5e9f.slice/crio-63742dc4f0921c89330c4a77995639c01f6e05945cc4557ffcd1e48421fc2863 WatchSource:0}: Error finding container 63742dc4f0921c89330c4a77995639c01f6e05945cc4557ffcd1e48421fc2863: Status 404 returned error can't find the container with id 63742dc4f0921c89330c4a77995639c01f6e05945cc4557ffcd1e48421fc2863 Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 14:22:17.330337 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2279186d-a6bf-4a99-a62f-6f1a0a405269" path="/var/lib/kubelet/pods/2279186d-a6bf-4a99-a62f-6f1a0a405269/volumes" Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 14:22:17.330888 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c833b21a-3aef-4e7e-9cf7-39e675c262ab" path="/var/lib/kubelet/pods/c833b21a-3aef-4e7e-9cf7-39e675c262ab/volumes" Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 14:22:17.613828 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" event={"ID":"13d06afc-d8e7-40dd-b7ea-561655336f97","Type":"ContainerStarted","Data":"0be44c5f3f19e68fbf6dce6bf1b70e6444705e9290d305385912da06ce2c3afc"} Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 
14:22:17.629191 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f64d44848-xg692" event={"ID":"e0973f28-44a0-4f66-aadf-42187c9ced68","Type":"ContainerStarted","Data":"29d5190c7a901cfa1799a6d90c0764d8921a8767ee518561b89df1b365404ae2"} Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 14:22:17.633854 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" event={"ID":"5d1b57e3-2938-4c59-a081-487660fa5e9f","Type":"ContainerStarted","Data":"63742dc4f0921c89330c4a77995639c01f6e05945cc4557ffcd1e48421fc2863"} Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 14:22:17.638717 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f97847bb-hrplw" event={"ID":"8d50193a-fddc-4dc4-a597-96d67e28a55b","Type":"ContainerStarted","Data":"64358c4b728140d8efc9e280dce700abf9ed2f863c7c7045c50100b4bb1fae9e"} Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 14:22:17.649940 4756 scope.go:117] "RemoveContainer" containerID="8afca66c131ab3c8ef6705d5594492be692fd18cebc663ccbb37cae1ccb2b7ba" Mar 18 14:22:17 crc kubenswrapper[4756]: E0318 14:22:17.652873 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8afca66c131ab3c8ef6705d5594492be692fd18cebc663ccbb37cae1ccb2b7ba\": container with ID starting with 8afca66c131ab3c8ef6705d5594492be692fd18cebc663ccbb37cae1ccb2b7ba not found: ID does not exist" containerID="8afca66c131ab3c8ef6705d5594492be692fd18cebc663ccbb37cae1ccb2b7ba" Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 14:22:17.652946 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8afca66c131ab3c8ef6705d5594492be692fd18cebc663ccbb37cae1ccb2b7ba"} err="failed to get container status \"8afca66c131ab3c8ef6705d5594492be692fd18cebc663ccbb37cae1ccb2b7ba\": rpc error: code = NotFound desc = could not find container 
\"8afca66c131ab3c8ef6705d5594492be692fd18cebc663ccbb37cae1ccb2b7ba\": container with ID starting with 8afca66c131ab3c8ef6705d5594492be692fd18cebc663ccbb37cae1ccb2b7ba not found: ID does not exist" Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 14:22:17.652975 4756 scope.go:117] "RemoveContainer" containerID="bf9f08b75e9442f6ba522deacdcd2f7bb836a85753a6f877b721088042a46c83" Mar 18 14:22:17 crc kubenswrapper[4756]: E0318 14:22:17.654842 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf9f08b75e9442f6ba522deacdcd2f7bb836a85753a6f877b721088042a46c83\": container with ID starting with bf9f08b75e9442f6ba522deacdcd2f7bb836a85753a6f877b721088042a46c83 not found: ID does not exist" containerID="bf9f08b75e9442f6ba522deacdcd2f7bb836a85753a6f877b721088042a46c83" Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 14:22:17.654895 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf9f08b75e9442f6ba522deacdcd2f7bb836a85753a6f877b721088042a46c83"} err="failed to get container status \"bf9f08b75e9442f6ba522deacdcd2f7bb836a85753a6f877b721088042a46c83\": rpc error: code = NotFound desc = could not find container \"bf9f08b75e9442f6ba522deacdcd2f7bb836a85753a6f877b721088042a46c83\": container with ID starting with bf9f08b75e9442f6ba522deacdcd2f7bb836a85753a6f877b721088042a46c83 not found: ID does not exist" Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 14:22:17.724250 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 14:22:17.724386 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 14:22:17 crc kubenswrapper[4756]: I0318 14:22:17.770141 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 
14:22:18.111661 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b67498b8b-xjbnx"] Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.122740 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9sh7"] Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.381820 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6ccf458dc-bmbzj"] Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.514383 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7995d6cd86-6kx6b"] Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.662723 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b67498b8b-xjbnx" event={"ID":"63030e19-cf53-4766-aeb1-e25be96a8652","Type":"ContainerStarted","Data":"8c2274ece5ebc8a4a190815d4ce05a9cb2a3dc9db13407bf3148be14c303c1b0"} Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.662765 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b67498b8b-xjbnx" event={"ID":"63030e19-cf53-4766-aeb1-e25be96a8652","Type":"ContainerStarted","Data":"3fa843f5c2188d4bae25148714f1eef63b43a8f7d1ed344e3c719b06b53593ae"} Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.672172 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ccf458dc-bmbzj" event={"ID":"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5","Type":"ContainerStarted","Data":"b9a0ea161b420fa0d03762337dbae7e5b146c6a5d9148f813aa445d4d819d099"} Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.695077 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f97847bb-hrplw" event={"ID":"8d50193a-fddc-4dc4-a597-96d67e28a55b","Type":"ContainerStarted","Data":"d64f2b0282847e55d050121638c79416baaf546c7b5340972c9b5c6e4f8e49eb"} Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.695132 4756 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/placement-5f97847bb-hrplw" event={"ID":"8d50193a-fddc-4dc4-a597-96d67e28a55b","Type":"ContainerStarted","Data":"bde8356d73eeed484d3c8d6dc02ba8f18da4576e7e28c66218ae5f6e3fbe5608"} Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.696189 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.696218 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.705539 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" event={"ID":"7ad060be-35eb-4d9d-8a45-0a387009708c","Type":"ContainerStarted","Data":"247f789ae7d8c67952fab2d1402dacad13261da07e1edf5a2f36fbd484fa4548"} Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.713510 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" event={"ID":"a6759509-7c13-49a6-893a-86605058eabc","Type":"ContainerStarted","Data":"2b72e01f365e79dca0460075d23a884762ad96f2454371ab9b7cb65b691fecde"} Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.713552 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" event={"ID":"a6759509-7c13-49a6-893a-86605058eabc","Type":"ContainerStarted","Data":"92e6cc276b86af550b15b283574207c2e805b6a5bea3b1f6b4944f302da2ecd1"} Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.725067 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f64d44848-xg692" event={"ID":"e0973f28-44a0-4f66-aadf-42187c9ced68","Type":"ContainerStarted","Data":"368231529ab0f5bc9d2f0f568171f58004c4331632ffe37bd9e22ed3449082eb"} Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.725187 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7f64d44848-xg692" 
Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.759028 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5f97847bb-hrplw" podStartSLOduration=3.759007526 podStartE2EDuration="3.759007526s" podCreationTimestamp="2026-03-18 14:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:18.727850064 +0000 UTC m=+1340.042268049" watchObservedRunningTime="2026-03-18 14:22:18.759007526 +0000 UTC m=+1340.073425501" Mar 18 14:22:18 crc kubenswrapper[4756]: I0318 14:22:18.801093 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7f64d44848-xg692" podStartSLOduration=3.801072524 podStartE2EDuration="3.801072524s" podCreationTimestamp="2026-03-18 14:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:18.784597908 +0000 UTC m=+1340.099015893" watchObservedRunningTime="2026-03-18 14:22:18.801072524 +0000 UTC m=+1340.115490499" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.574573 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.575232 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.755001 4756 generic.go:334] "Generic (PLEG): container finished" podID="a6759509-7c13-49a6-893a-86605058eabc" containerID="2b72e01f365e79dca0460075d23a884762ad96f2454371ab9b7cb65b691fecde" exitCode=0 Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.755061 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" 
event={"ID":"a6759509-7c13-49a6-893a-86605058eabc","Type":"ContainerDied","Data":"2b72e01f365e79dca0460075d23a884762ad96f2454371ab9b7cb65b691fecde"} Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.755090 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" event={"ID":"a6759509-7c13-49a6-893a-86605058eabc","Type":"ContainerStarted","Data":"fdc6356e5a4853b64e2f05704d770128b5465e8a47e7f010752fabaa5d058615"} Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.755499 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.763196 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b67498b8b-xjbnx" event={"ID":"63030e19-cf53-4766-aeb1-e25be96a8652","Type":"ContainerStarted","Data":"9dfd38e589c690724b745aad477d20369b623572a57e28b404a808f7c7437b07"} Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.765092 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.785664 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.794383 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" podStartSLOduration=4.794361151 podStartE2EDuration="4.794361151s" podCreationTimestamp="2026-03-18 14:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:19.779374986 +0000 UTC m=+1341.093792971" watchObservedRunningTime="2026-03-18 14:22:19.794361151 +0000 UTC m=+1341.108779116" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.815303 4756 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/barbican-api-6b67498b8b-xjbnx" podStartSLOduration=3.8152786770000002 podStartE2EDuration="3.815278677s" podCreationTimestamp="2026-03-18 14:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:19.801054443 +0000 UTC m=+1341.115472418" watchObservedRunningTime="2026-03-18 14:22:19.815278677 +0000 UTC m=+1341.129696672" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.828175 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-58c774f67d-hdzcx"] Mar 18 14:22:19 crc kubenswrapper[4756]: E0318 14:22:19.828611 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2279186d-a6bf-4a99-a62f-6f1a0a405269" containerName="dnsmasq-dns" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.828626 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2279186d-a6bf-4a99-a62f-6f1a0a405269" containerName="dnsmasq-dns" Mar 18 14:22:19 crc kubenswrapper[4756]: E0318 14:22:19.828646 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2279186d-a6bf-4a99-a62f-6f1a0a405269" containerName="init" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.828652 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2279186d-a6bf-4a99-a62f-6f1a0a405269" containerName="init" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.828841 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2279186d-a6bf-4a99-a62f-6f1a0a405269" containerName="dnsmasq-dns" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.829924 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.837197 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58c774f67d-hdzcx"] Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.839572 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.839791 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.941183 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-combined-ca-bundle\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.941230 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-logs\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.941253 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-config-data-custom\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.941294 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-internal-tls-certs\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.941313 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbmkg\" (UniqueName: \"kubernetes.io/projected/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-kube-api-access-zbmkg\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.941356 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-config-data\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:19 crc kubenswrapper[4756]: I0318 14:22:19.941375 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-public-tls-certs\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.043478 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-internal-tls-certs\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.043515 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zbmkg\" (UniqueName: \"kubernetes.io/projected/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-kube-api-access-zbmkg\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.043684 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-config-data\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.043711 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-public-tls-certs\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.043960 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-combined-ca-bundle\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.044026 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-logs\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.044057 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-config-data-custom\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.046113 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-logs\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.052574 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-config-data-custom\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.056110 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-config-data\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.061274 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-combined-ca-bundle\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.061818 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-internal-tls-certs\") pod 
\"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.061895 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-public-tls-certs\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.064597 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbmkg\" (UniqueName: \"kubernetes.io/projected/ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f-kube-api-access-zbmkg\") pod \"barbican-api-58c774f67d-hdzcx\" (UID: \"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f\") " pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.164959 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:20 crc kubenswrapper[4756]: I0318 14:22:20.775257 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:21 crc kubenswrapper[4756]: I0318 14:22:21.806587 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ccf458dc-bmbzj" event={"ID":"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5","Type":"ContainerStarted","Data":"6600ebed082bfae5ef6ca67b7daa200a196f78077a501efed8143f72165341b9"} Mar 18 14:22:21 crc kubenswrapper[4756]: I0318 14:22:21.809405 4756 generic.go:334] "Generic (PLEG): container finished" podID="9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e" containerID="d2d1fe77d93c754e6515368dd66601e0dd41859e0f19118ebdbf4cde6a187c19" exitCode=0 Mar 18 14:22:21 crc kubenswrapper[4756]: I0318 14:22:21.809472 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rpx8m" event={"ID":"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e","Type":"ContainerDied","Data":"d2d1fe77d93c754e6515368dd66601e0dd41859e0f19118ebdbf4cde6a187c19"} Mar 18 14:22:21 crc kubenswrapper[4756]: I0318 14:22:21.812505 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" event={"ID":"5d1b57e3-2938-4c59-a081-487660fa5e9f","Type":"ContainerStarted","Data":"fb51032520f65fccf327fdb4b4e3d7d94e28657311bd1f96cf4db2506006ee79"} Mar 18 14:22:21 crc kubenswrapper[4756]: I0318 14:22:21.815197 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" event={"ID":"7ad060be-35eb-4d9d-8a45-0a387009708c","Type":"ContainerStarted","Data":"4ba3019c2fff70d7e2dceb6730463da34f5ced510c2010dcb038508ec0287bfe"} Mar 18 14:22:21 crc kubenswrapper[4756]: I0318 14:22:21.822338 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-tpt8q" 
event={"ID":"7401eed2-7f0c-4f80-932b-5bc2df6684f8","Type":"ContainerStarted","Data":"ec76b4cea8296c216664ec5ba1e3c126b34d14be347f1fd7b7ad8daaf3003eeb"} Mar 18 14:22:21 crc kubenswrapper[4756]: I0318 14:22:21.824901 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" event={"ID":"13d06afc-d8e7-40dd-b7ea-561655336f97","Type":"ContainerStarted","Data":"c006322c4807ea8927587674552655649936fc0097e1d613ce6614c8e698bfe6"} Mar 18 14:22:21 crc kubenswrapper[4756]: I0318 14:22:21.833272 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58c774f67d-hdzcx"] Mar 18 14:22:21 crc kubenswrapper[4756]: W0318 14:22:21.848053 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddfe8af8_356d_4e42_ade9_26a1d1a8cb4f.slice/crio-0305ea652c90a1426a0bea96c2907c4565f0f92a088e62eee6e52dbd5542b773 WatchSource:0}: Error finding container 0305ea652c90a1426a0bea96c2907c4565f0f92a088e62eee6e52dbd5542b773: Status 404 returned error can't find the container with id 0305ea652c90a1426a0bea96c2907c4565f0f92a088e62eee6e52dbd5542b773 Mar 18 14:22:22 crc kubenswrapper[4756]: I0318 14:22:22.842410 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ccf458dc-bmbzj" event={"ID":"8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5","Type":"ContainerStarted","Data":"38c0d5e939bd739a36cb7e4a55e770508a42f1cbb75f6a335eed53f5925f6dba"} Mar 18 14:22:22 crc kubenswrapper[4756]: I0318 14:22:22.847822 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" event={"ID":"5d1b57e3-2938-4c59-a081-487660fa5e9f","Type":"ContainerStarted","Data":"1de276bfcd235177d654a66752588c0b3c0da07806a81a2031ffc851e7adc670"} Mar 18 14:22:22 crc kubenswrapper[4756]: I0318 14:22:22.852144 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" event={"ID":"7ad060be-35eb-4d9d-8a45-0a387009708c","Type":"ContainerStarted","Data":"e8021d8a68cf88f0593168e6a4b4f6cd60229e27aa3e58afa6ede846f5b55551"} Mar 18 14:22:22 crc kubenswrapper[4756]: I0318 14:22:22.856470 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" event={"ID":"13d06afc-d8e7-40dd-b7ea-561655336f97","Type":"ContainerStarted","Data":"b285bec204242680be7c26b14963e685bc7e1552e4001ccd52758b79082b5e57"} Mar 18 14:22:22 crc kubenswrapper[4756]: I0318 14:22:22.866179 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58c774f67d-hdzcx" event={"ID":"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f","Type":"ContainerStarted","Data":"be4ec8619169a9ce9295c3773b8cecd607938a39e3fbef1becd0a44628e38e4a"} Mar 18 14:22:22 crc kubenswrapper[4756]: I0318 14:22:22.866221 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58c774f67d-hdzcx" event={"ID":"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f","Type":"ContainerStarted","Data":"908204806f9722dfaf968ed4f96bed531b31450346d6ad991e936ba49951f38c"} Mar 18 14:22:22 crc kubenswrapper[4756]: I0318 14:22:22.866233 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58c774f67d-hdzcx" event={"ID":"ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f","Type":"ContainerStarted","Data":"0305ea652c90a1426a0bea96c2907c4565f0f92a088e62eee6e52dbd5542b773"} Mar 18 14:22:22 crc kubenswrapper[4756]: I0318 14:22:22.866262 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:22 crc kubenswrapper[4756]: I0318 14:22:22.866501 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:22 crc kubenswrapper[4756]: I0318 14:22:22.884632 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-worker-6ccf458dc-bmbzj" podStartSLOduration=3.984570794 podStartE2EDuration="6.884610118s" podCreationTimestamp="2026-03-18 14:22:16 +0000 UTC" firstStartedPulling="2026-03-18 14:22:18.442420926 +0000 UTC m=+1339.756838891" lastFinishedPulling="2026-03-18 14:22:21.34246024 +0000 UTC m=+1342.656878215" observedRunningTime="2026-03-18 14:22:22.881759961 +0000 UTC m=+1344.196177936" watchObservedRunningTime="2026-03-18 14:22:22.884610118 +0000 UTC m=+1344.199028093" Mar 18 14:22:22 crc kubenswrapper[4756]: I0318 14:22:22.897031 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-tpt8q" podStartSLOduration=3.020151592 podStartE2EDuration="52.897009904s" podCreationTimestamp="2026-03-18 14:21:30 +0000 UTC" firstStartedPulling="2026-03-18 14:21:31.469422321 +0000 UTC m=+1292.783840296" lastFinishedPulling="2026-03-18 14:22:21.346280633 +0000 UTC m=+1342.660698608" observedRunningTime="2026-03-18 14:22:21.85467777 +0000 UTC m=+1343.169095745" watchObservedRunningTime="2026-03-18 14:22:22.897009904 +0000 UTC m=+1344.211427879" Mar 18 14:22:22 crc kubenswrapper[4756]: I0318 14:22:22.952329 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-65fcb6c8bc-k2qdb"] Mar 18 14:22:22 crc kubenswrapper[4756]: I0318 14:22:22.962999 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" podStartSLOduration=3.905918668 podStartE2EDuration="7.962975007s" podCreationTimestamp="2026-03-18 14:22:15 +0000 UTC" firstStartedPulling="2026-03-18 14:22:17.286370847 +0000 UTC m=+1338.600788822" lastFinishedPulling="2026-03-18 14:22:21.343427186 +0000 UTC m=+1342.657845161" observedRunningTime="2026-03-18 14:22:22.92275856 +0000 UTC m=+1344.237176525" watchObservedRunningTime="2026-03-18 14:22:22.962975007 +0000 UTC m=+1344.277392982" Mar 18 14:22:23 crc kubenswrapper[4756]: I0318 14:22:22.998678 4756 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7995d6cd86-6kx6b" podStartSLOduration=5.162778042 podStartE2EDuration="7.998655362s" podCreationTimestamp="2026-03-18 14:22:15 +0000 UTC" firstStartedPulling="2026-03-18 14:22:18.506512919 +0000 UTC m=+1339.820930894" lastFinishedPulling="2026-03-18 14:22:21.342390239 +0000 UTC m=+1342.656808214" observedRunningTime="2026-03-18 14:22:22.944883078 +0000 UTC m=+1344.259301063" watchObservedRunningTime="2026-03-18 14:22:22.998655362 +0000 UTC m=+1344.313073337" Mar 18 14:22:23 crc kubenswrapper[4756]: I0318 14:22:23.004788 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" podStartSLOduration=3.945445237 podStartE2EDuration="8.003171115s" podCreationTimestamp="2026-03-18 14:22:15 +0000 UTC" firstStartedPulling="2026-03-18 14:22:17.242297665 +0000 UTC m=+1338.556715640" lastFinishedPulling="2026-03-18 14:22:21.300023543 +0000 UTC m=+1342.614441518" observedRunningTime="2026-03-18 14:22:22.975152857 +0000 UTC m=+1344.289570842" watchObservedRunningTime="2026-03-18 14:22:23.003171115 +0000 UTC m=+1344.317589090" Mar 18 14:22:23 crc kubenswrapper[4756]: I0318 14:22:23.025050 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-788dd8d778-6jnls"] Mar 18 14:22:23 crc kubenswrapper[4756]: I0318 14:22:23.027191 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-58c774f67d-hdzcx" podStartSLOduration=4.027174353 podStartE2EDuration="4.027174353s" podCreationTimestamp="2026-03-18 14:22:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:23.008835338 +0000 UTC m=+1344.323253313" watchObservedRunningTime="2026-03-18 14:22:23.027174353 +0000 UTC m=+1344.341592328" Mar 18 14:22:23 crc 
kubenswrapper[4756]: I0318 14:22:23.220456 4756 scope.go:117] "RemoveContainer" containerID="822ec18092ea709c0b867cb7ba08e5d135047c4f3f3353b97c7b082f6c6eeea7" Mar 18 14:22:24 crc kubenswrapper[4756]: I0318 14:22:24.883279 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" podUID="13d06afc-d8e7-40dd-b7ea-561655336f97" containerName="barbican-keystone-listener-log" containerID="cri-o://c006322c4807ea8927587674552655649936fc0097e1d613ce6614c8e698bfe6" gracePeriod=30 Mar 18 14:22:24 crc kubenswrapper[4756]: I0318 14:22:24.883491 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" podUID="5d1b57e3-2938-4c59-a081-487660fa5e9f" containerName="barbican-worker-log" containerID="cri-o://fb51032520f65fccf327fdb4b4e3d7d94e28657311bd1f96cf4db2506006ee79" gracePeriod=30 Mar 18 14:22:24 crc kubenswrapper[4756]: I0318 14:22:24.883892 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" podUID="13d06afc-d8e7-40dd-b7ea-561655336f97" containerName="barbican-keystone-listener" containerID="cri-o://b285bec204242680be7c26b14963e685bc7e1552e4001ccd52758b79082b5e57" gracePeriod=30 Mar 18 14:22:24 crc kubenswrapper[4756]: I0318 14:22:24.884109 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" podUID="5d1b57e3-2938-4c59-a081-487660fa5e9f" containerName="barbican-worker" containerID="cri-o://1de276bfcd235177d654a66752588c0b3c0da07806a81a2031ffc851e7adc670" gracePeriod=30 Mar 18 14:22:25 crc kubenswrapper[4756]: I0318 14:22:25.895804 4756 generic.go:334] "Generic (PLEG): container finished" podID="5d1b57e3-2938-4c59-a081-487660fa5e9f" containerID="fb51032520f65fccf327fdb4b4e3d7d94e28657311bd1f96cf4db2506006ee79" exitCode=143 Mar 18 14:22:25 crc kubenswrapper[4756]: I0318 14:22:25.895989 
4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" event={"ID":"5d1b57e3-2938-4c59-a081-487660fa5e9f","Type":"ContainerDied","Data":"fb51032520f65fccf327fdb4b4e3d7d94e28657311bd1f96cf4db2506006ee79"} Mar 18 14:22:25 crc kubenswrapper[4756]: I0318 14:22:25.898252 4756 generic.go:334] "Generic (PLEG): container finished" podID="7401eed2-7f0c-4f80-932b-5bc2df6684f8" containerID="ec76b4cea8296c216664ec5ba1e3c126b34d14be347f1fd7b7ad8daaf3003eeb" exitCode=0 Mar 18 14:22:25 crc kubenswrapper[4756]: I0318 14:22:25.898290 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-tpt8q" event={"ID":"7401eed2-7f0c-4f80-932b-5bc2df6684f8","Type":"ContainerDied","Data":"ec76b4cea8296c216664ec5ba1e3c126b34d14be347f1fd7b7ad8daaf3003eeb"} Mar 18 14:22:25 crc kubenswrapper[4756]: I0318 14:22:25.900775 4756 generic.go:334] "Generic (PLEG): container finished" podID="13d06afc-d8e7-40dd-b7ea-561655336f97" containerID="c006322c4807ea8927587674552655649936fc0097e1d613ce6614c8e698bfe6" exitCode=143 Mar 18 14:22:25 crc kubenswrapper[4756]: I0318 14:22:25.900832 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" event={"ID":"13d06afc-d8e7-40dd-b7ea-561655336f97","Type":"ContainerDied","Data":"c006322c4807ea8927587674552655649936fc0097e1d613ce6614c8e698bfe6"} Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.433906 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.489066 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-config-data\") pod \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.489180 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-db-sync-config-data\") pod \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.489226 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-etc-machine-id\") pod \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.489299 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-scripts\") pod \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.489325 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26xbh\" (UniqueName: \"kubernetes.io/projected/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-kube-api-access-26xbh\") pod \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.489354 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-combined-ca-bundle\") pod \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\" (UID: \"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e\") " Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.489732 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e" (UID: "9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.489940 4756 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.499562 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-scripts" (OuterVolumeSpecName: "scripts") pod "9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e" (UID: "9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.501977 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e" (UID: "9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.512409 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-kube-api-access-26xbh" (OuterVolumeSpecName: "kube-api-access-26xbh") pod "9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e" (UID: "9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e"). InnerVolumeSpecName "kube-api-access-26xbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.554765 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e" (UID: "9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.577267 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-config-data" (OuterVolumeSpecName: "config-data") pod "9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e" (UID: "9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.591081 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.591180 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.591209 4756 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.591220 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.591230 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26xbh\" (UniqueName: \"kubernetes.io/projected/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e-kube-api-access-26xbh\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.599258 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.662901 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tnw7j"] Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.663191 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" podUID="92a17284-5f6e-4781-b315-374908f04a82" containerName="dnsmasq-dns" 
containerID="cri-o://d3d30add7ba22b0a1eb3606cff1f24caea20033a1779f1a015e402c8a1c064a5" gracePeriod=10 Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.933859 4756 generic.go:334] "Generic (PLEG): container finished" podID="13d06afc-d8e7-40dd-b7ea-561655336f97" containerID="b285bec204242680be7c26b14963e685bc7e1552e4001ccd52758b79082b5e57" exitCode=0 Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.934958 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" event={"ID":"13d06afc-d8e7-40dd-b7ea-561655336f97","Type":"ContainerDied","Data":"b285bec204242680be7c26b14963e685bc7e1552e4001ccd52758b79082b5e57"} Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.937358 4756 generic.go:334] "Generic (PLEG): container finished" podID="92a17284-5f6e-4781-b315-374908f04a82" containerID="d3d30add7ba22b0a1eb3606cff1f24caea20033a1779f1a015e402c8a1c064a5" exitCode=0 Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.937426 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" event={"ID":"92a17284-5f6e-4781-b315-374908f04a82","Type":"ContainerDied","Data":"d3d30add7ba22b0a1eb3606cff1f24caea20033a1779f1a015e402c8a1c064a5"} Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.942685 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rpx8m" event={"ID":"9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e","Type":"ContainerDied","Data":"de2d616707bd338f3f128c3e5089c8f320f9fc76b5c4e75d6bde34833c945c41"} Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.942729 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de2d616707bd338f3f128c3e5089c8f320f9fc76b5c4e75d6bde34833c945c41" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.942798 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rpx8m" Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.953849 4756 generic.go:334] "Generic (PLEG): container finished" podID="5d1b57e3-2938-4c59-a081-487660fa5e9f" containerID="1de276bfcd235177d654a66752588c0b3c0da07806a81a2031ffc851e7adc670" exitCode=0 Mar 18 14:22:26 crc kubenswrapper[4756]: I0318 14:22:26.953916 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" event={"ID":"5d1b57e3-2938-4c59-a081-487660fa5e9f","Type":"ContainerDied","Data":"1de276bfcd235177d654a66752588c0b3c0da07806a81a2031ffc851e7adc670"} Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.561319 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.572791 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.627987 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-config-data\") pod \"13d06afc-d8e7-40dd-b7ea-561655336f97\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.628058 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-config-data-custom\") pod \"13d06afc-d8e7-40dd-b7ea-561655336f97\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.628098 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13d06afc-d8e7-40dd-b7ea-561655336f97-logs\") pod 
\"13d06afc-d8e7-40dd-b7ea-561655336f97\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.628139 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-combined-ca-bundle\") pod \"13d06afc-d8e7-40dd-b7ea-561655336f97\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.628166 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d1b57e3-2938-4c59-a081-487660fa5e9f-logs\") pod \"5d1b57e3-2938-4c59-a081-487660fa5e9f\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.628203 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-config-data-custom\") pod \"5d1b57e3-2938-4c59-a081-487660fa5e9f\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.628278 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-combined-ca-bundle\") pod \"5d1b57e3-2938-4c59-a081-487660fa5e9f\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.628298 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b26sv\" (UniqueName: \"kubernetes.io/projected/5d1b57e3-2938-4c59-a081-487660fa5e9f-kube-api-access-b26sv\") pod \"5d1b57e3-2938-4c59-a081-487660fa5e9f\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.628361 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-config-data\") pod \"5d1b57e3-2938-4c59-a081-487660fa5e9f\" (UID: \"5d1b57e3-2938-4c59-a081-487660fa5e9f\") " Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.628441 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6q6p\" (UniqueName: \"kubernetes.io/projected/13d06afc-d8e7-40dd-b7ea-561655336f97-kube-api-access-t6q6p\") pod \"13d06afc-d8e7-40dd-b7ea-561655336f97\" (UID: \"13d06afc-d8e7-40dd-b7ea-561655336f97\") " Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.645188 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13d06afc-d8e7-40dd-b7ea-561655336f97-logs" (OuterVolumeSpecName: "logs") pod "13d06afc-d8e7-40dd-b7ea-561655336f97" (UID: "13d06afc-d8e7-40dd-b7ea-561655336f97"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.646258 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d1b57e3-2938-4c59-a081-487660fa5e9f-logs" (OuterVolumeSpecName: "logs") pod "5d1b57e3-2938-4c59-a081-487660fa5e9f" (UID: "5d1b57e3-2938-4c59-a081-487660fa5e9f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.647296 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5d1b57e3-2938-4c59-a081-487660fa5e9f" (UID: "5d1b57e3-2938-4c59-a081-487660fa5e9f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.658366 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1b57e3-2938-4c59-a081-487660fa5e9f-kube-api-access-b26sv" (OuterVolumeSpecName: "kube-api-access-b26sv") pod "5d1b57e3-2938-4c59-a081-487660fa5e9f" (UID: "5d1b57e3-2938-4c59-a081-487660fa5e9f"). InnerVolumeSpecName "kube-api-access-b26sv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.675714 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13d06afc-d8e7-40dd-b7ea-561655336f97-kube-api-access-t6q6p" (OuterVolumeSpecName: "kube-api-access-t6q6p") pod "13d06afc-d8e7-40dd-b7ea-561655336f97" (UID: "13d06afc-d8e7-40dd-b7ea-561655336f97"). InnerVolumeSpecName "kube-api-access-t6q6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.684268 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "13d06afc-d8e7-40dd-b7ea-561655336f97" (UID: "13d06afc-d8e7-40dd-b7ea-561655336f97"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.720548 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13d06afc-d8e7-40dd-b7ea-561655336f97" (UID: "13d06afc-d8e7-40dd-b7ea-561655336f97"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.735934 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.735968 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b26sv\" (UniqueName: \"kubernetes.io/projected/5d1b57e3-2938-4c59-a081-487660fa5e9f-kube-api-access-b26sv\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.735979 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6q6p\" (UniqueName: \"kubernetes.io/projected/13d06afc-d8e7-40dd-b7ea-561655336f97-kube-api-access-t6q6p\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.735989 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.736000 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13d06afc-d8e7-40dd-b7ea-561655336f97-logs\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.736009 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.736017 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d1b57e3-2938-4c59-a081-487660fa5e9f-logs\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.743719 4756 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.744179 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:22:27 crc kubenswrapper[4756]: E0318 14:22:27.744350 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e" containerName="cinder-db-sync" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.744397 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e" containerName="cinder-db-sync" Mar 18 14:22:27 crc kubenswrapper[4756]: E0318 14:22:27.744422 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13d06afc-d8e7-40dd-b7ea-561655336f97" containerName="barbican-keystone-listener" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.744429 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d06afc-d8e7-40dd-b7ea-561655336f97" containerName="barbican-keystone-listener" Mar 18 14:22:27 crc kubenswrapper[4756]: E0318 14:22:27.744461 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13d06afc-d8e7-40dd-b7ea-561655336f97" containerName="barbican-keystone-listener-log" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.744470 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d06afc-d8e7-40dd-b7ea-561655336f97" containerName="barbican-keystone-listener-log" Mar 18 14:22:27 crc kubenswrapper[4756]: E0318 14:22:27.744483 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1b57e3-2938-4c59-a081-487660fa5e9f" containerName="barbican-worker-log" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.744489 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1b57e3-2938-4c59-a081-487660fa5e9f" containerName="barbican-worker-log" Mar 18 14:22:27 crc kubenswrapper[4756]: E0318 14:22:27.744513 4756 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1b57e3-2938-4c59-a081-487660fa5e9f" containerName="barbican-worker" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.744519 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1b57e3-2938-4c59-a081-487660fa5e9f" containerName="barbican-worker" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.744751 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="13d06afc-d8e7-40dd-b7ea-561655336f97" containerName="barbican-keystone-listener-log" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.744789 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e" containerName="cinder-db-sync" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.744802 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d1b57e3-2938-4c59-a081-487660fa5e9f" containerName="barbican-worker-log" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.744814 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d1b57e3-2938-4c59-a081-487660fa5e9f" containerName="barbican-worker" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.744828 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7401eed2-7f0c-4f80-932b-5bc2df6684f8" containerName="cloudkitty-db-sync" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.744838 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="13d06afc-d8e7-40dd-b7ea-561655336f97" containerName="barbican-keystone-listener" Mar 18 14:22:27 crc kubenswrapper[4756]: E0318 14:22:27.750574 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7401eed2-7f0c-4f80-932b-5bc2df6684f8" containerName="cloudkitty-db-sync" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.750600 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7401eed2-7f0c-4f80-932b-5bc2df6684f8" containerName="cloudkitty-db-sync" Mar 18 14:22:27 
crc kubenswrapper[4756]: I0318 14:22:27.752719 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.759102 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.759336 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.759541 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ws4s4" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.759828 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.783501 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-config-data" (OuterVolumeSpecName: "config-data") pod "13d06afc-d8e7-40dd-b7ea-561655336f97" (UID: "13d06afc-d8e7-40dd-b7ea-561655336f97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.830788 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.831694 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d1b57e3-2938-4c59-a081-487660fa5e9f" (UID: "5d1b57e3-2938-4c59-a081-487660fa5e9f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.837991 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7401eed2-7f0c-4f80-932b-5bc2df6684f8-certs\") pod \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.839466 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-config-data" (OuterVolumeSpecName: "config-data") pod "5d1b57e3-2938-4c59-a081-487660fa5e9f" (UID: "5d1b57e3-2938-4c59-a081-487660fa5e9f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.845185 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-combined-ca-bundle\") pod \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.845324 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-scripts\") pod \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.845395 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-config-data\") pod \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.845457 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-l8tcf\" (UniqueName: \"kubernetes.io/projected/7401eed2-7f0c-4f80-932b-5bc2df6684f8-kube-api-access-l8tcf\") pod \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\" (UID: \"7401eed2-7f0c-4f80-932b-5bc2df6684f8\") " Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.845758 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m47qq\" (UniqueName: \"kubernetes.io/projected/44ffe1f5-1bf5-4597-946e-af93cafc35e4-kube-api-access-m47qq\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.845846 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-config-data\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.846037 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-scripts\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.846066 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44ffe1f5-1bf5-4597-946e-af93cafc35e4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.846179 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.846215 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.846311 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13d06afc-d8e7-40dd-b7ea-561655336f97-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.846330 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.846348 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d1b57e3-2938-4c59-a081-487660fa5e9f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.852678 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-scripts" (OuterVolumeSpecName: "scripts") pod "7401eed2-7f0c-4f80-932b-5bc2df6684f8" (UID: "7401eed2-7f0c-4f80-932b-5bc2df6684f8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.852876 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7401eed2-7f0c-4f80-932b-5bc2df6684f8-certs" (OuterVolumeSpecName: "certs") pod "7401eed2-7f0c-4f80-932b-5bc2df6684f8" (UID: "7401eed2-7f0c-4f80-932b-5bc2df6684f8"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.854321 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7401eed2-7f0c-4f80-932b-5bc2df6684f8-kube-api-access-l8tcf" (OuterVolumeSpecName: "kube-api-access-l8tcf") pod "7401eed2-7f0c-4f80-932b-5bc2df6684f8" (UID: "7401eed2-7f0c-4f80-932b-5bc2df6684f8"). InnerVolumeSpecName "kube-api-access-l8tcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.862084 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-px8hg"] Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.869647 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.871860 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-px8hg"] Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.906682 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7401eed2-7f0c-4f80-932b-5bc2df6684f8" (UID: "7401eed2-7f0c-4f80-932b-5bc2df6684f8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.914371 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.915944 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.919074 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.925835 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-config-data" (OuterVolumeSpecName: "config-data") pod "7401eed2-7f0c-4f80-932b-5bc2df6684f8" (UID: "7401eed2-7f0c-4f80-932b-5bc2df6684f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.946191 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947243 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947266 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzhlg\" (UniqueName: \"kubernetes.io/projected/f59ed4a2-e058-4c0f-acec-68e717c8da59-kube-api-access-tzhlg\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947302 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947320 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947342 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947357 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-config-data\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947381 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947408 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m47qq\" (UniqueName: \"kubernetes.io/projected/44ffe1f5-1bf5-4597-946e-af93cafc35e4-kube-api-access-m47qq\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947440 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-config-data\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947488 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f3ac7084-50e8-406f-90f5-2cf4d2350935-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:27 crc 
kubenswrapper[4756]: I0318 14:22:27.947505 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-scripts\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947520 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdlkz\" (UniqueName: \"kubernetes.io/projected/f3ac7084-50e8-406f-90f5-2cf4d2350935-kube-api-access-pdlkz\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947541 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947559 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947598 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-config\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947614 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3ac7084-50e8-406f-90f5-2cf4d2350935-logs\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947633 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-config-data-custom\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947657 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-scripts\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947674 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44ffe1f5-1bf5-4597-946e-af93cafc35e4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947728 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947740 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947749 4756 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7401eed2-7f0c-4f80-932b-5bc2df6684f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947757 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8tcf\" (UniqueName: \"kubernetes.io/projected/7401eed2-7f0c-4f80-932b-5bc2df6684f8-kube-api-access-l8tcf\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947767 4756 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7401eed2-7f0c-4f80-932b-5bc2df6684f8-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.947804 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44ffe1f5-1bf5-4597-946e-af93cafc35e4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.951886 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-config-data\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.952506 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-scripts\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.956104 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.961841 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.968742 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.983750 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m47qq\" (UniqueName: \"kubernetes.io/projected/44ffe1f5-1bf5-4597-946e-af93cafc35e4-kube-api-access-m47qq\") pod \"cinder-scheduler-0\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.988603 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a791a6a-0e50-465a-90cf-e4af5bdc12de","Type":"ContainerStarted","Data":"2c9cadb470d69e4ba22f022cd3177e4dd2e4e7426da004bcf5d2250205c1d4f1"} Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.988757 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="ceilometer-central-agent" containerID="cri-o://6cd2f7cec824393628fb88948e4665aad058ef37c40b05f1165872d6909f1e3a" gracePeriod=30 Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.988968 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.989228 4756 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="proxy-httpd" containerID="cri-o://2c9cadb470d69e4ba22f022cd3177e4dd2e4e7426da004bcf5d2250205c1d4f1" gracePeriod=30 Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.989276 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="sg-core" containerID="cri-o://544736d183a3cbf62f7ad025b7e67c4b647a9fe7b073a0f3ad059d28034fcae6" gracePeriod=30 Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.989310 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="ceilometer-notification-agent" containerID="cri-o://7a34dbea7ba5ecf140e7fb7f99455f2180277a988725a32294c2b9450d52ab50" gracePeriod=30 Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.997190 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" event={"ID":"5d1b57e3-2938-4c59-a081-487660fa5e9f","Type":"ContainerDied","Data":"63742dc4f0921c89330c4a77995639c01f6e05945cc4557ffcd1e48421fc2863"} Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.997389 4756 scope.go:117] "RemoveContainer" containerID="1de276bfcd235177d654a66752588c0b3c0da07806a81a2031ffc851e7adc670" Mar 18 14:22:27 crc kubenswrapper[4756]: I0318 14:22:27.997557 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65fcb6c8bc-k2qdb" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.007946 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" event={"ID":"92a17284-5f6e-4781-b315-374908f04a82","Type":"ContainerDied","Data":"8bb16264848ac922a3719f0e674b0491997b9d16d644a519ae3bc932454148f0"} Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.008230 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-tnw7j" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.022197 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-tpt8q" event={"ID":"7401eed2-7f0c-4f80-932b-5bc2df6684f8","Type":"ContainerDied","Data":"613d75d846910596582c301beac7223f31a73f201fe536624a7a35b859389892"} Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.022239 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="613d75d846910596582c301beac7223f31a73f201fe536624a7a35b859389892" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.022306 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-tpt8q" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.027967 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.30744694 podStartE2EDuration="58.027947989s" podCreationTimestamp="2026-03-18 14:21:30 +0000 UTC" firstStartedPulling="2026-03-18 14:21:31.838348057 +0000 UTC m=+1293.152766032" lastFinishedPulling="2026-03-18 14:22:27.558849106 +0000 UTC m=+1348.873267081" observedRunningTime="2026-03-18 14:22:28.012896053 +0000 UTC m=+1349.327314028" watchObservedRunningTime="2026-03-18 14:22:28.027947989 +0000 UTC m=+1349.342365964" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.032165 4756 scope.go:117] "RemoveContainer" containerID="fb51032520f65fccf327fdb4b4e3d7d94e28657311bd1f96cf4db2506006ee79" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049025 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-ovsdbserver-nb\") pod \"92a17284-5f6e-4781-b315-374908f04a82\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049146 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6775\" (UniqueName: \"kubernetes.io/projected/92a17284-5f6e-4781-b315-374908f04a82-kube-api-access-f6775\") pod \"92a17284-5f6e-4781-b315-374908f04a82\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049179 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-dns-svc\") pod \"92a17284-5f6e-4781-b315-374908f04a82\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049227 4756 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-dns-swift-storage-0\") pod \"92a17284-5f6e-4781-b315-374908f04a82\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049283 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-config\") pod \"92a17284-5f6e-4781-b315-374908f04a82\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049403 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-ovsdbserver-sb\") pod \"92a17284-5f6e-4781-b315-374908f04a82\" (UID: \"92a17284-5f6e-4781-b315-374908f04a82\") " Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049633 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049728 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f3ac7084-50e8-406f-90f5-2cf4d2350935-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049751 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdlkz\" (UniqueName: 
\"kubernetes.io/projected/f3ac7084-50e8-406f-90f5-2cf4d2350935-kube-api-access-pdlkz\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049769 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-scripts\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049787 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049808 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049831 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-config\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049849 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3ac7084-50e8-406f-90f5-2cf4d2350935-logs\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " 
pod="openstack/cinder-api-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049865 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-config-data-custom\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049910 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049928 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzhlg\" (UniqueName: \"kubernetes.io/projected/f59ed4a2-e058-4c0f-acec-68e717c8da59-kube-api-access-tzhlg\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049956 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.049976 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-config-data\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.050646 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3ac7084-50e8-406f-90f5-2cf4d2350935-logs\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.053227 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" event={"ID":"13d06afc-d8e7-40dd-b7ea-561655336f97","Type":"ContainerDied","Data":"0be44c5f3f19e68fbf6dce6bf1b70e6444705e9290d305385912da06ce2c3afc"} Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.053447 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-788dd8d778-6jnls" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.054876 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f3ac7084-50e8-406f-90f5-2cf4d2350935-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.055824 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.055984 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.056555 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.056754 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-config\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.057580 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.071108 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-config-data\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.076707 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzhlg\" (UniqueName: \"kubernetes.io/projected/f59ed4a2-e058-4c0f-acec-68e717c8da59-kube-api-access-tzhlg\") pod \"dnsmasq-dns-5c9776ccc5-px8hg\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.076738 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a17284-5f6e-4781-b315-374908f04a82-kube-api-access-f6775" (OuterVolumeSpecName: 
"kube-api-access-f6775") pod "92a17284-5f6e-4781-b315-374908f04a82" (UID: "92a17284-5f6e-4781-b315-374908f04a82"). InnerVolumeSpecName "kube-api-access-f6775". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.077183 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdlkz\" (UniqueName: \"kubernetes.io/projected/f3ac7084-50e8-406f-90f5-2cf4d2350935-kube-api-access-pdlkz\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.083714 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-scripts\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.083719 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.086291 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-config-data-custom\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.086460 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " pod="openstack/cinder-api-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.095480 4756 scope.go:117] "RemoveContainer" containerID="d3d30add7ba22b0a1eb3606cff1f24caea20033a1779f1a015e402c8a1c064a5" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.097179 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-65fcb6c8bc-k2qdb"] Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.139166 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-65fcb6c8bc-k2qdb"] Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.147095 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "92a17284-5f6e-4781-b315-374908f04a82" (UID: "92a17284-5f6e-4781-b315-374908f04a82"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.149633 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-config" (OuterVolumeSpecName: "config") pod "92a17284-5f6e-4781-b315-374908f04a82" (UID: "92a17284-5f6e-4781-b315-374908f04a82"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.152001 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.152024 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.152035 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6775\" (UniqueName: \"kubernetes.io/projected/92a17284-5f6e-4781-b315-374908f04a82-kube-api-access-f6775\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.156558 4756 scope.go:117] "RemoveContainer" containerID="b76c8e507d83330b6e047742ff88e87f9d6ed89a4c571db8768e44f329878e5c" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.156938 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-4gwt6"] Mar 18 14:22:28 crc kubenswrapper[4756]: E0318 14:22:28.157423 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a17284-5f6e-4781-b315-374908f04a82" containerName="dnsmasq-dns" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.157443 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a17284-5f6e-4781-b315-374908f04a82" 
containerName="dnsmasq-dns" Mar 18 14:22:28 crc kubenswrapper[4756]: E0318 14:22:28.157459 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a17284-5f6e-4781-b315-374908f04a82" containerName="init" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.157465 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a17284-5f6e-4781-b315-374908f04a82" containerName="init" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.157598 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "92a17284-5f6e-4781-b315-374908f04a82" (UID: "92a17284-5f6e-4781-b315-374908f04a82"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.157661 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a17284-5f6e-4781-b315-374908f04a82" containerName="dnsmasq-dns" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.159577 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.160855 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "92a17284-5f6e-4781-b315-374908f04a82" (UID: "92a17284-5f6e-4781-b315-374908f04a82"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.161996 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.161999 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-d7k67" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.162333 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.162478 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.162787 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.168631 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-4gwt6"] Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.181276 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-788dd8d778-6jnls"] Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.187023 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.191383 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-788dd8d778-6jnls"] Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.207039 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "92a17284-5f6e-4781-b315-374908f04a82" (UID: "92a17284-5f6e-4781-b315-374908f04a82"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.221644 4756 scope.go:117] "RemoveContainer" containerID="b285bec204242680be7c26b14963e685bc7e1552e4001ccd52758b79082b5e57" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.240558 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.252594 4756 scope.go:117] "RemoveContainer" containerID="c006322c4807ea8927587674552655649936fc0097e1d613ce6614c8e698bfe6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.254552 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-certs\") pod \"cloudkitty-storageinit-4gwt6\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.254606 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-combined-ca-bundle\") pod \"cloudkitty-storageinit-4gwt6\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.254669 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-scripts\") pod \"cloudkitty-storageinit-4gwt6\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.254802 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-config-data\") pod \"cloudkitty-storageinit-4gwt6\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.254856 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrqhl\" (UniqueName: \"kubernetes.io/projected/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-kube-api-access-hrqhl\") pod \"cloudkitty-storageinit-4gwt6\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.254962 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.254977 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.254987 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92a17284-5f6e-4781-b315-374908f04a82-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.357793 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tnw7j"] Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.358872 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-config-data\") pod \"cloudkitty-storageinit-4gwt6\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc 
kubenswrapper[4756]: I0318 14:22:28.358913 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrqhl\" (UniqueName: \"kubernetes.io/projected/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-kube-api-access-hrqhl\") pod \"cloudkitty-storageinit-4gwt6\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.358961 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-certs\") pod \"cloudkitty-storageinit-4gwt6\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.358983 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-combined-ca-bundle\") pod \"cloudkitty-storageinit-4gwt6\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.359036 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-scripts\") pod \"cloudkitty-storageinit-4gwt6\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.365502 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-combined-ca-bundle\") pod \"cloudkitty-storageinit-4gwt6\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.366697 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-certs\") pod \"cloudkitty-storageinit-4gwt6\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.367263 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tnw7j"] Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.381814 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrqhl\" (UniqueName: \"kubernetes.io/projected/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-kube-api-access-hrqhl\") pod \"cloudkitty-storageinit-4gwt6\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.383814 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-config-data\") pod \"cloudkitty-storageinit-4gwt6\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.384198 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-scripts\") pod \"cloudkitty-storageinit-4gwt6\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.477860 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:28 crc kubenswrapper[4756]: E0318 14:22:28.702859 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a791a6a_0e50_465a_90cf_e4af5bdc12de.slice/crio-conmon-6cd2f7cec824393628fb88948e4665aad058ef37c40b05f1165872d6909f1e3a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a791a6a_0e50_465a_90cf_e4af5bdc12de.slice/crio-6cd2f7cec824393628fb88948e4665aad058ef37c40b05f1165872d6909f1e3a.scope\": RecentStats: unable to find data in memory cache]" Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.704446 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 14:22:28 crc kubenswrapper[4756]: W0318 14:22:28.724182 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44ffe1f5_1bf5_4597_946e_af93cafc35e4.slice/crio-105a30526ea3488528884279c67353ce9dacb59302ab6410df33432e157a45f4 WatchSource:0}: Error finding container 105a30526ea3488528884279c67353ce9dacb59302ab6410df33432e157a45f4: Status 404 returned error can't find the container with id 105a30526ea3488528884279c67353ce9dacb59302ab6410df33432e157a45f4 Mar 18 14:22:28 crc kubenswrapper[4756]: I0318 14:22:28.899194 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-px8hg"] Mar 18 14:22:29 crc kubenswrapper[4756]: I0318 14:22:29.011317 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 14:22:29 crc kubenswrapper[4756]: I0318 14:22:29.028732 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:29 crc kubenswrapper[4756]: I0318 14:22:29.094252 4756 
generic.go:334] "Generic (PLEG): container finished" podID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerID="544736d183a3cbf62f7ad025b7e67c4b647a9fe7b073a0f3ad059d28034fcae6" exitCode=2 Mar 18 14:22:29 crc kubenswrapper[4756]: I0318 14:22:29.094298 4756 generic.go:334] "Generic (PLEG): container finished" podID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerID="6cd2f7cec824393628fb88948e4665aad058ef37c40b05f1165872d6909f1e3a" exitCode=0 Mar 18 14:22:29 crc kubenswrapper[4756]: I0318 14:22:29.094346 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a791a6a-0e50-465a-90cf-e4af5bdc12de","Type":"ContainerDied","Data":"544736d183a3cbf62f7ad025b7e67c4b647a9fe7b073a0f3ad059d28034fcae6"} Mar 18 14:22:29 crc kubenswrapper[4756]: I0318 14:22:29.094392 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a791a6a-0e50-465a-90cf-e4af5bdc12de","Type":"ContainerDied","Data":"6cd2f7cec824393628fb88948e4665aad058ef37c40b05f1165872d6909f1e3a"} Mar 18 14:22:29 crc kubenswrapper[4756]: I0318 14:22:29.117150 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f3ac7084-50e8-406f-90f5-2cf4d2350935","Type":"ContainerStarted","Data":"2b2d90cb48883ef2efa049ac425da655351111d655aecf504e0729f05013936f"} Mar 18 14:22:29 crc kubenswrapper[4756]: I0318 14:22:29.121897 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"44ffe1f5-1bf5-4597-946e-af93cafc35e4","Type":"ContainerStarted","Data":"105a30526ea3488528884279c67353ce9dacb59302ab6410df33432e157a45f4"} Mar 18 14:22:29 crc kubenswrapper[4756]: I0318 14:22:29.146252 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" event={"ID":"f59ed4a2-e058-4c0f-acec-68e717c8da59","Type":"ContainerStarted","Data":"7fda424f3b7ee3b4819084e0e00fcfef912b7672dabe8a0dda79cef3e2a69df8"} Mar 18 14:22:29 crc kubenswrapper[4756]: 
I0318 14:22:29.193987 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:29 crc kubenswrapper[4756]: I0318 14:22:29.270573 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-4gwt6"] Mar 18 14:22:29 crc kubenswrapper[4756]: I0318 14:22:29.406197 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13d06afc-d8e7-40dd-b7ea-561655336f97" path="/var/lib/kubelet/pods/13d06afc-d8e7-40dd-b7ea-561655336f97/volumes" Mar 18 14:22:29 crc kubenswrapper[4756]: I0318 14:22:29.406840 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d1b57e3-2938-4c59-a081-487660fa5e9f" path="/var/lib/kubelet/pods/5d1b57e3-2938-4c59-a081-487660fa5e9f/volumes" Mar 18 14:22:29 crc kubenswrapper[4756]: I0318 14:22:29.410242 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a17284-5f6e-4781-b315-374908f04a82" path="/var/lib/kubelet/pods/92a17284-5f6e-4781-b315-374908f04a82/volumes" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.208461 4756 generic.go:334] "Generic (PLEG): container finished" podID="f59ed4a2-e058-4c0f-acec-68e717c8da59" containerID="1ff93633ac3b1e12e773533c04ffe9217a2942b4623d4bae8da29d1ac504c444" exitCode=0 Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.208911 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" event={"ID":"f59ed4a2-e058-4c0f-acec-68e717c8da59","Type":"ContainerDied","Data":"1ff93633ac3b1e12e773533c04ffe9217a2942b4623d4bae8da29d1ac504c444"} Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.218515 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-4gwt6" event={"ID":"7001fa44-03c0-4a84-aa50-3005a9c4e1ed","Type":"ContainerStarted","Data":"329468c575a50e3429f544e362dd65cdb79dc698e77fa8f01bfa0c996edd253c"} Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 
14:22:30.218561 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-4gwt6" event={"ID":"7001fa44-03c0-4a84-aa50-3005a9c4e1ed","Type":"ContainerStarted","Data":"8c68f31e8dcee44dab9b7a270743c65b63894fc9e5d9fcb272254044c2d73e34"} Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.230808 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f3ac7084-50e8-406f-90f5-2cf4d2350935","Type":"ContainerStarted","Data":"a7f0050fbff4930b6df477526c56e2770bd369a97e76593023a51fda2f706ae7"} Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.319799 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-4gwt6" podStartSLOduration=2.319779779 podStartE2EDuration="2.319779779s" podCreationTimestamp="2026-03-18 14:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:30.291185285 +0000 UTC m=+1351.605603260" watchObservedRunningTime="2026-03-18 14:22:30.319779779 +0000 UTC m=+1351.634197754" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.366636 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.389854 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56db767868-svvqr" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.759556 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c574c99c-tvw8d"] Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.760047 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c574c99c-tvw8d" podUID="5c5360dd-3416-47bb-9e45-b8517121fd45" containerName="neutron-api" containerID="cri-o://9a9e9240c04440aab63f8bb3a1f0c3b56e111402c99293eab5524b0c7fbcd6d1" gracePeriod=30 Mar 18 14:22:30 crc 
kubenswrapper[4756]: I0318 14:22:30.761198 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c574c99c-tvw8d" podUID="5c5360dd-3416-47bb-9e45-b8517121fd45" containerName="neutron-httpd" containerID="cri-o://5f31b8b654771aa8604a532288d6818a726c175229175cbaafe4ea4b741b7df0" gracePeriod=30 Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.796632 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-766db9c4f-fbsb4"] Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.803377 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.814296 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-766db9c4f-fbsb4"] Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.858235 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-ovndb-tls-certs\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.858279 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-config\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.858332 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxrps\" (UniqueName: \"kubernetes.io/projected/05701c8c-2d0a-47aa-be93-6e037492fb49-kube-api-access-lxrps\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " 
pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.858395 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-internal-tls-certs\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.858415 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-httpd-config\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.858434 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-public-tls-certs\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.858464 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-combined-ca-bundle\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.872276 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5c574c99c-tvw8d" podUID="5c5360dd-3416-47bb-9e45-b8517121fd45" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.178:9696/\": read tcp 10.217.0.2:41850->10.217.0.178:9696: read: 
connection reset by peer" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.960263 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-internal-tls-certs\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.960309 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-httpd-config\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.960325 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-public-tls-certs\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.960354 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-combined-ca-bundle\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.960448 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-ovndb-tls-certs\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.960469 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-config\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.960501 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxrps\" (UniqueName: \"kubernetes.io/projected/05701c8c-2d0a-47aa-be93-6e037492fb49-kube-api-access-lxrps\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.968884 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-internal-tls-certs\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.973939 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-combined-ca-bundle\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.974734 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-httpd-config\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.978661 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-ovndb-tls-certs\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.981853 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-public-tls-certs\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.984072 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/05701c8c-2d0a-47aa-be93-6e037492fb49-config\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:30 crc kubenswrapper[4756]: I0318 14:22:30.987875 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxrps\" (UniqueName: \"kubernetes.io/projected/05701c8c-2d0a-47aa-be93-6e037492fb49-kube-api-access-lxrps\") pod \"neutron-766db9c4f-fbsb4\" (UID: \"05701c8c-2d0a-47aa-be93-6e037492fb49\") " pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:31 crc kubenswrapper[4756]: I0318 14:22:31.140568 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:31 crc kubenswrapper[4756]: I0318 14:22:31.253635 4756 generic.go:334] "Generic (PLEG): container finished" podID="5c5360dd-3416-47bb-9e45-b8517121fd45" containerID="5f31b8b654771aa8604a532288d6818a726c175229175cbaafe4ea4b741b7df0" exitCode=0 Mar 18 14:22:31 crc kubenswrapper[4756]: I0318 14:22:31.253709 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c574c99c-tvw8d" event={"ID":"5c5360dd-3416-47bb-9e45-b8517121fd45","Type":"ContainerDied","Data":"5f31b8b654771aa8604a532288d6818a726c175229175cbaafe4ea4b741b7df0"} Mar 18 14:22:31 crc kubenswrapper[4756]: I0318 14:22:31.257630 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f3ac7084-50e8-406f-90f5-2cf4d2350935","Type":"ContainerStarted","Data":"aa31d4b7a655bec3dc2cc33e5a9a40a1fb83f5ff25db2e115ddea543f06c9dc7"} Mar 18 14:22:31 crc kubenswrapper[4756]: I0318 14:22:31.257770 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f3ac7084-50e8-406f-90f5-2cf4d2350935" containerName="cinder-api-log" containerID="cri-o://a7f0050fbff4930b6df477526c56e2770bd369a97e76593023a51fda2f706ae7" gracePeriod=30 Mar 18 14:22:31 crc kubenswrapper[4756]: I0318 14:22:31.258053 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 14:22:31 crc kubenswrapper[4756]: I0318 14:22:31.258312 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f3ac7084-50e8-406f-90f5-2cf4d2350935" containerName="cinder-api" containerID="cri-o://aa31d4b7a655bec3dc2cc33e5a9a40a1fb83f5ff25db2e115ddea543f06c9dc7" gracePeriod=30 Mar 18 14:22:31 crc kubenswrapper[4756]: I0318 14:22:31.272202 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"44ffe1f5-1bf5-4597-946e-af93cafc35e4","Type":"ContainerStarted","Data":"57f970cd2f535c245f5098bb2d2f042956c4b8735df33702323e9f0d6b066cbb"} Mar 18 14:22:31 crc kubenswrapper[4756]: I0318 14:22:31.286928 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.286907359 podStartE2EDuration="4.286907359s" podCreationTimestamp="2026-03-18 14:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:31.280508146 +0000 UTC m=+1352.594926131" watchObservedRunningTime="2026-03-18 14:22:31.286907359 +0000 UTC m=+1352.601325334" Mar 18 14:22:31 crc kubenswrapper[4756]: I0318 14:22:31.292199 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" event={"ID":"f59ed4a2-e058-4c0f-acec-68e717c8da59","Type":"ContainerStarted","Data":"dfc1ce6a8e27d688914c58c1e367e829a7de76ff4d25ad0dfa26724b42b5c27a"} Mar 18 14:22:31 crc kubenswrapper[4756]: I0318 14:22:31.292482 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:31 crc kubenswrapper[4756]: I0318 14:22:31.328798 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" podStartSLOduration=4.328776211 podStartE2EDuration="4.328776211s" podCreationTimestamp="2026-03-18 14:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:31.31950942 +0000 UTC m=+1352.633927395" watchObservedRunningTime="2026-03-18 14:22:31.328776211 +0000 UTC m=+1352.643194186" Mar 18 14:22:31 crc kubenswrapper[4756]: I0318 14:22:31.809458 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-766db9c4f-fbsb4"] Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.300206 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-766db9c4f-fbsb4" event={"ID":"05701c8c-2d0a-47aa-be93-6e037492fb49","Type":"ContainerStarted","Data":"bcf10d59589ce9938e85fe064a765da8e6f03e03a80537d61022a5cfcd4c4dd4"} Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.301346 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-766db9c4f-fbsb4" event={"ID":"05701c8c-2d0a-47aa-be93-6e037492fb49","Type":"ContainerStarted","Data":"9745c3aca4bcdcde24c5492a68a874436a7248bf33c156c2dadb4623855f2b17"} Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.301457 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-766db9c4f-fbsb4" event={"ID":"05701c8c-2d0a-47aa-be93-6e037492fb49","Type":"ContainerStarted","Data":"722802229f865cba5614c94f37f0341d355aac0a83f68a0516edfa0d115b8bf8"} Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.301504 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.302534 4756 generic.go:334] "Generic (PLEG): container finished" podID="7001fa44-03c0-4a84-aa50-3005a9c4e1ed" containerID="329468c575a50e3429f544e362dd65cdb79dc698e77fa8f01bfa0c996edd253c" exitCode=0 Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.302592 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-4gwt6" event={"ID":"7001fa44-03c0-4a84-aa50-3005a9c4e1ed","Type":"ContainerDied","Data":"329468c575a50e3429f544e362dd65cdb79dc698e77fa8f01bfa0c996edd253c"} Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.304937 4756 generic.go:334] "Generic (PLEG): container finished" podID="f3ac7084-50e8-406f-90f5-2cf4d2350935" containerID="a7f0050fbff4930b6df477526c56e2770bd369a97e76593023a51fda2f706ae7" exitCode=143 Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.304992 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"f3ac7084-50e8-406f-90f5-2cf4d2350935","Type":"ContainerDied","Data":"a7f0050fbff4930b6df477526c56e2770bd369a97e76593023a51fda2f706ae7"} Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.306954 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"44ffe1f5-1bf5-4597-946e-af93cafc35e4","Type":"ContainerStarted","Data":"d57ad1ac9116b5cd3b2e37e876daff68f2ecb8deff212532ba0e1820ad42fb08"} Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.320826 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-766db9c4f-fbsb4" podStartSLOduration=2.320809595 podStartE2EDuration="2.320809595s" podCreationTimestamp="2026-03-18 14:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:32.31767737 +0000 UTC m=+1353.632095345" watchObservedRunningTime="2026-03-18 14:22:32.320809595 +0000 UTC m=+1353.635227570" Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.364872 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.126359868 podStartE2EDuration="5.364855336s" podCreationTimestamp="2026-03-18 14:22:27 +0000 UTC" firstStartedPulling="2026-03-18 14:22:28.75173928 +0000 UTC m=+1350.066157255" lastFinishedPulling="2026-03-18 14:22:29.990234748 +0000 UTC m=+1351.304652723" observedRunningTime="2026-03-18 14:22:32.364164417 +0000 UTC m=+1353.678582402" watchObservedRunningTime="2026-03-18 14:22:32.364855336 +0000 UTC m=+1353.679273311" Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.531541 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.558469 4756 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/neutron-5c574c99c-tvw8d" podUID="5c5360dd-3416-47bb-9e45-b8517121fd45" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.178:9696/\": dial tcp 10.217.0.178:9696: connect: connection refused" Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.665652 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58c774f67d-hdzcx" Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.718040 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b67498b8b-xjbnx"] Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.718449 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b67498b8b-xjbnx" podUID="63030e19-cf53-4766-aeb1-e25be96a8652" containerName="barbican-api-log" containerID="cri-o://8c2274ece5ebc8a4a190815d4ce05a9cb2a3dc9db13407bf3148be14c303c1b0" gracePeriod=30 Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.718627 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b67498b8b-xjbnx" podUID="63030e19-cf53-4766-aeb1-e25be96a8652" containerName="barbican-api" containerID="cri-o://9dfd38e589c690724b745aad477d20369b623572a57e28b404a808f7c7437b07" gracePeriod=30 Mar 18 14:22:32 crc kubenswrapper[4756]: I0318 14:22:32.724545 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6b67498b8b-xjbnx" podUID="63030e19-cf53-4766-aeb1-e25be96a8652" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.186:9311/healthcheck\": EOF" Mar 18 14:22:33 crc kubenswrapper[4756]: I0318 14:22:33.084876 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 14:22:33 crc kubenswrapper[4756]: I0318 14:22:33.329464 4756 generic.go:334] "Generic (PLEG): container finished" podID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" 
containerID="7a34dbea7ba5ecf140e7fb7f99455f2180277a988725a32294c2b9450d52ab50" exitCode=0 Mar 18 14:22:33 crc kubenswrapper[4756]: I0318 14:22:33.336459 4756 generic.go:334] "Generic (PLEG): container finished" podID="63030e19-cf53-4766-aeb1-e25be96a8652" containerID="8c2274ece5ebc8a4a190815d4ce05a9cb2a3dc9db13407bf3148be14c303c1b0" exitCode=143 Mar 18 14:22:33 crc kubenswrapper[4756]: I0318 14:22:33.356185 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a791a6a-0e50-465a-90cf-e4af5bdc12de","Type":"ContainerDied","Data":"7a34dbea7ba5ecf140e7fb7f99455f2180277a988725a32294c2b9450d52ab50"} Mar 18 14:22:33 crc kubenswrapper[4756]: I0318 14:22:33.356224 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b67498b8b-xjbnx" event={"ID":"63030e19-cf53-4766-aeb1-e25be96a8652","Type":"ContainerDied","Data":"8c2274ece5ebc8a4a190815d4ce05a9cb2a3dc9db13407bf3148be14c303c1b0"} Mar 18 14:22:33 crc kubenswrapper[4756]: I0318 14:22:33.936643 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.063900 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-certs\") pod \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.063991 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-combined-ca-bundle\") pod \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.064157 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-scripts\") pod \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.064223 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrqhl\" (UniqueName: \"kubernetes.io/projected/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-kube-api-access-hrqhl\") pod \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.064277 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-config-data\") pod \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\" (UID: \"7001fa44-03c0-4a84-aa50-3005a9c4e1ed\") " Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.078369 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-kube-api-access-hrqhl" (OuterVolumeSpecName: "kube-api-access-hrqhl") pod "7001fa44-03c0-4a84-aa50-3005a9c4e1ed" (UID: "7001fa44-03c0-4a84-aa50-3005a9c4e1ed"). InnerVolumeSpecName "kube-api-access-hrqhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.093261 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-scripts" (OuterVolumeSpecName: "scripts") pod "7001fa44-03c0-4a84-aa50-3005a9c4e1ed" (UID: "7001fa44-03c0-4a84-aa50-3005a9c4e1ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.106370 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-certs" (OuterVolumeSpecName: "certs") pod "7001fa44-03c0-4a84-aa50-3005a9c4e1ed" (UID: "7001fa44-03c0-4a84-aa50-3005a9c4e1ed"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.139961 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7001fa44-03c0-4a84-aa50-3005a9c4e1ed" (UID: "7001fa44-03c0-4a84-aa50-3005a9c4e1ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.151314 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-config-data" (OuterVolumeSpecName: "config-data") pod "7001fa44-03c0-4a84-aa50-3005a9c4e1ed" (UID: "7001fa44-03c0-4a84-aa50-3005a9c4e1ed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.166871 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.166902 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrqhl\" (UniqueName: \"kubernetes.io/projected/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-kube-api-access-hrqhl\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.166912 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.166921 4756 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.166931 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7001fa44-03c0-4a84-aa50-3005a9c4e1ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.346895 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-4gwt6" event={"ID":"7001fa44-03c0-4a84-aa50-3005a9c4e1ed","Type":"ContainerDied","Data":"8c68f31e8dcee44dab9b7a270743c65b63894fc9e5d9fcb272254044c2d73e34"} Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.346948 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c68f31e8dcee44dab9b7a270743c65b63894fc9e5d9fcb272254044c2d73e34" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.346956 4756 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-4gwt6" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.523780 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 18 14:22:34 crc kubenswrapper[4756]: E0318 14:22:34.524187 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7001fa44-03c0-4a84-aa50-3005a9c4e1ed" containerName="cloudkitty-storageinit" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.524201 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7001fa44-03c0-4a84-aa50-3005a9c4e1ed" containerName="cloudkitty-storageinit" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.524383 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7001fa44-03c0-4a84-aa50-3005a9c4e1ed" containerName="cloudkitty-storageinit" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.527677 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.535373 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.535625 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.535743 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-d7k67" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.535820 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.536048 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.557362 4756 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/cloudkitty-proc-0"] Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.629268 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-px8hg"] Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.629523 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" podUID="f59ed4a2-e058-4c0f-acec-68e717c8da59" containerName="dnsmasq-dns" containerID="cri-o://dfc1ce6a8e27d688914c58c1e367e829a7de76ff4d25ad0dfa26724b42b5c27a" gracePeriod=10 Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.658844 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-vbm9s"] Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.661659 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.677277 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5da0e296-5cae-4c4e-9740-3459f59a6e05-certs\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.677375 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dzb5\" (UniqueName: \"kubernetes.io/projected/5da0e296-5cae-4c4e-9740-3459f59a6e05-kube-api-access-4dzb5\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.677419 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: 
\"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.677440 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.677505 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-scripts\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.677535 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-config-data\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.684436 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-vbm9s"] Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.779136 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.779467 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dzb5\" (UniqueName: 
\"kubernetes.io/projected/5da0e296-5cae-4c4e-9740-3459f59a6e05-kube-api-access-4dzb5\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.779493 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.779532 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-config\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.779552 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.779569 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.779616 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sld7\" (UniqueName: 
\"kubernetes.io/projected/d9909000-8e99-44f7-9f35-570757a60e59-kube-api-access-5sld7\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.779657 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-scripts\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.779675 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.779707 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-config-data\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.779725 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5da0e296-5cae-4c4e-9740-3459f59a6e05-certs\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.779741 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-dns-svc\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: 
\"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.785689 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.785705 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-scripts\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.785727 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5da0e296-5cae-4c4e-9740-3459f59a6e05-certs\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.785786 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.786002 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-config-data\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.797406 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4dzb5\" (UniqueName: \"kubernetes.io/projected/5da0e296-5cae-4c4e-9740-3459f59a6e05-kube-api-access-4dzb5\") pod \"cloudkitty-proc-0\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.856660 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.880865 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.880955 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-dns-svc\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.881766 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.881802 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-dns-svc\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.880998 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.881905 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.881911 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.881982 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-config\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.882561 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-config\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.882639 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5sld7\" (UniqueName: \"kubernetes.io/projected/d9909000-8e99-44f7-9f35-570757a60e59-kube-api-access-5sld7\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.882825 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.901064 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sld7\" (UniqueName: \"kubernetes.io/projected/d9909000-8e99-44f7-9f35-570757a60e59-kube-api-access-5sld7\") pod \"dnsmasq-dns-67bdc55879-vbm9s\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.943678 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.946061 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.959829 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.964702 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 18 14:22:34 crc kubenswrapper[4756]: I0318 14:22:34.989963 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.089422 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.089497 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-scripts\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.089521 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-config-data\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.089633 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gktzt\" (UniqueName: \"kubernetes.io/projected/7b9a812f-0fcb-4212-81cb-154b37fae939-kube-api-access-gktzt\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.089668 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 
14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.089697 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9a812f-0fcb-4212-81cb-154b37fae939-logs\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.089766 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7b9a812f-0fcb-4212-81cb-154b37fae939-certs\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.189750 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.191567 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gktzt\" (UniqueName: \"kubernetes.io/projected/7b9a812f-0fcb-4212-81cb-154b37fae939-kube-api-access-gktzt\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.191629 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.191661 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9a812f-0fcb-4212-81cb-154b37fae939-logs\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " 
pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.191737 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7b9a812f-0fcb-4212-81cb-154b37fae939-certs\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.191787 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.191828 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-scripts\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.191850 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-config-data\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.192882 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9a812f-0fcb-4212-81cb-154b37fae939-logs\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.196299 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/7b9a812f-0fcb-4212-81cb-154b37fae939-certs\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.197954 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-config-data\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.200001 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.201311 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.217687 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-scripts\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.217716 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gktzt\" (UniqueName: \"kubernetes.io/projected/7b9a812f-0fcb-4212-81cb-154b37fae939-kube-api-access-gktzt\") pod \"cloudkitty-api-0\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc 
kubenswrapper[4756]: I0318 14:22:35.292874 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzhlg\" (UniqueName: \"kubernetes.io/projected/f59ed4a2-e058-4c0f-acec-68e717c8da59-kube-api-access-tzhlg\") pod \"f59ed4a2-e058-4c0f-acec-68e717c8da59\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.292945 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-dns-swift-storage-0\") pod \"f59ed4a2-e058-4c0f-acec-68e717c8da59\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.293042 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-config\") pod \"f59ed4a2-e058-4c0f-acec-68e717c8da59\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.293082 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-ovsdbserver-sb\") pod \"f59ed4a2-e058-4c0f-acec-68e717c8da59\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.293152 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-ovsdbserver-nb\") pod \"f59ed4a2-e058-4c0f-acec-68e717c8da59\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.293205 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-dns-svc\") pod \"f59ed4a2-e058-4c0f-acec-68e717c8da59\" (UID: \"f59ed4a2-e058-4c0f-acec-68e717c8da59\") " Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.301641 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f59ed4a2-e058-4c0f-acec-68e717c8da59-kube-api-access-tzhlg" (OuterVolumeSpecName: "kube-api-access-tzhlg") pod "f59ed4a2-e058-4c0f-acec-68e717c8da59" (UID: "f59ed4a2-e058-4c0f-acec-68e717c8da59"). InnerVolumeSpecName "kube-api-access-tzhlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.358980 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f59ed4a2-e058-4c0f-acec-68e717c8da59" (UID: "f59ed4a2-e058-4c0f-acec-68e717c8da59"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.369234 4756 generic.go:334] "Generic (PLEG): container finished" podID="f59ed4a2-e058-4c0f-acec-68e717c8da59" containerID="dfc1ce6a8e27d688914c58c1e367e829a7de76ff4d25ad0dfa26724b42b5c27a" exitCode=0 Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.369289 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.369291 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" event={"ID":"f59ed4a2-e058-4c0f-acec-68e717c8da59","Type":"ContainerDied","Data":"dfc1ce6a8e27d688914c58c1e367e829a7de76ff4d25ad0dfa26724b42b5c27a"} Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.369344 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-px8hg" event={"ID":"f59ed4a2-e058-4c0f-acec-68e717c8da59","Type":"ContainerDied","Data":"7fda424f3b7ee3b4819084e0e00fcfef912b7672dabe8a0dda79cef3e2a69df8"} Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.369365 4756 scope.go:117] "RemoveContainer" containerID="dfc1ce6a8e27d688914c58c1e367e829a7de76ff4d25ad0dfa26724b42b5c27a" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.395861 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.395892 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzhlg\" (UniqueName: \"kubernetes.io/projected/f59ed4a2-e058-4c0f-acec-68e717c8da59-kube-api-access-tzhlg\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.399733 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f59ed4a2-e058-4c0f-acec-68e717c8da59" (UID: "f59ed4a2-e058-4c0f-acec-68e717c8da59"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.408026 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f59ed4a2-e058-4c0f-acec-68e717c8da59" (UID: "f59ed4a2-e058-4c0f-acec-68e717c8da59"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.411089 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-config" (OuterVolumeSpecName: "config") pod "f59ed4a2-e058-4c0f-acec-68e717c8da59" (UID: "f59ed4a2-e058-4c0f-acec-68e717c8da59"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.418428 4756 scope.go:117] "RemoveContainer" containerID="1ff93633ac3b1e12e773533c04ffe9217a2942b4623d4bae8da29d1ac504c444" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.422369 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f59ed4a2-e058-4c0f-acec-68e717c8da59" (UID: "f59ed4a2-e058-4c0f-acec-68e717c8da59"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.430077 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.478236 4756 scope.go:117] "RemoveContainer" containerID="dfc1ce6a8e27d688914c58c1e367e829a7de76ff4d25ad0dfa26724b42b5c27a" Mar 18 14:22:35 crc kubenswrapper[4756]: E0318 14:22:35.478975 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfc1ce6a8e27d688914c58c1e367e829a7de76ff4d25ad0dfa26724b42b5c27a\": container with ID starting with dfc1ce6a8e27d688914c58c1e367e829a7de76ff4d25ad0dfa26724b42b5c27a not found: ID does not exist" containerID="dfc1ce6a8e27d688914c58c1e367e829a7de76ff4d25ad0dfa26724b42b5c27a" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.479010 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfc1ce6a8e27d688914c58c1e367e829a7de76ff4d25ad0dfa26724b42b5c27a"} err="failed to get container status \"dfc1ce6a8e27d688914c58c1e367e829a7de76ff4d25ad0dfa26724b42b5c27a\": rpc error: code = NotFound desc = could not find container \"dfc1ce6a8e27d688914c58c1e367e829a7de76ff4d25ad0dfa26724b42b5c27a\": container with ID starting with dfc1ce6a8e27d688914c58c1e367e829a7de76ff4d25ad0dfa26724b42b5c27a not found: ID does not exist" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.479029 4756 scope.go:117] "RemoveContainer" containerID="1ff93633ac3b1e12e773533c04ffe9217a2942b4623d4bae8da29d1ac504c444" Mar 18 14:22:35 crc kubenswrapper[4756]: E0318 14:22:35.479289 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff93633ac3b1e12e773533c04ffe9217a2942b4623d4bae8da29d1ac504c444\": container with ID starting with 1ff93633ac3b1e12e773533c04ffe9217a2942b4623d4bae8da29d1ac504c444 not found: ID does 
not exist" containerID="1ff93633ac3b1e12e773533c04ffe9217a2942b4623d4bae8da29d1ac504c444" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.479308 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff93633ac3b1e12e773533c04ffe9217a2942b4623d4bae8da29d1ac504c444"} err="failed to get container status \"1ff93633ac3b1e12e773533c04ffe9217a2942b4623d4bae8da29d1ac504c444\": rpc error: code = NotFound desc = could not find container \"1ff93633ac3b1e12e773533c04ffe9217a2942b4623d4bae8da29d1ac504c444\": container with ID starting with 1ff93633ac3b1e12e773533c04ffe9217a2942b4623d4bae8da29d1ac504c444 not found: ID does not exist" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.482223 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.497988 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.498042 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.498052 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.498061 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f59ed4a2-e058-4c0f-acec-68e717c8da59-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.565895 4756 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-vbm9s"] Mar 18 14:22:35 crc kubenswrapper[4756]: W0318 14:22:35.576361 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9909000_8e99_44f7_9f35_570757a60e59.slice/crio-efa440667c92c35928da2a2d751c3979a34d7a8afeea0cb819535dba3f80ac8c WatchSource:0}: Error finding container efa440667c92c35928da2a2d751c3979a34d7a8afeea0cb819535dba3f80ac8c: Status 404 returned error can't find the container with id efa440667c92c35928da2a2d751c3979a34d7a8afeea0cb819535dba3f80ac8c Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.728405 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-px8hg"] Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.740653 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-px8hg"] Mar 18 14:22:35 crc kubenswrapper[4756]: I0318 14:22:35.982301 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.391719 4756 generic.go:334] "Generic (PLEG): container finished" podID="d9909000-8e99-44f7-9f35-570757a60e59" containerID="301dc2b8c0a915aca527b5629fe8d1ee84f9ca9a6fcb54a44f0a3c237a5cb2aa" exitCode=0 Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.392226 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" event={"ID":"d9909000-8e99-44f7-9f35-570757a60e59","Type":"ContainerDied","Data":"301dc2b8c0a915aca527b5629fe8d1ee84f9ca9a6fcb54a44f0a3c237a5cb2aa"} Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.392253 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" event={"ID":"d9909000-8e99-44f7-9f35-570757a60e59","Type":"ContainerStarted","Data":"efa440667c92c35928da2a2d751c3979a34d7a8afeea0cb819535dba3f80ac8c"} Mar 18 14:22:36 
crc kubenswrapper[4756]: I0318 14:22:36.399372 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"5da0e296-5cae-4c4e-9740-3459f59a6e05","Type":"ContainerStarted","Data":"7cfb911adcb40ec8bc1083359eab597c541387cb2fff1dd8e78307d22b762825"} Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.427338 4756 generic.go:334] "Generic (PLEG): container finished" podID="5c5360dd-3416-47bb-9e45-b8517121fd45" containerID="9a9e9240c04440aab63f8bb3a1f0c3b56e111402c99293eab5524b0c7fbcd6d1" exitCode=0 Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.427399 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c574c99c-tvw8d" event={"ID":"5c5360dd-3416-47bb-9e45-b8517121fd45","Type":"ContainerDied","Data":"9a9e9240c04440aab63f8bb3a1f0c3b56e111402c99293eab5524b0c7fbcd6d1"} Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.429460 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"7b9a812f-0fcb-4212-81cb-154b37fae939","Type":"ContainerStarted","Data":"6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51"} Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.429484 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"7b9a812f-0fcb-4212-81cb-154b37fae939","Type":"ContainerStarted","Data":"0c70dcbd739e8222c79f9d2282c6890ef3ead1fbe9f5cf1b161c875e32df5a97"} Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.654053 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.721384 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-internal-tls-certs\") pod \"5c5360dd-3416-47bb-9e45-b8517121fd45\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.721525 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-ovndb-tls-certs\") pod \"5c5360dd-3416-47bb-9e45-b8517121fd45\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.721578 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-config\") pod \"5c5360dd-3416-47bb-9e45-b8517121fd45\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.721615 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd2wn\" (UniqueName: \"kubernetes.io/projected/5c5360dd-3416-47bb-9e45-b8517121fd45-kube-api-access-kd2wn\") pod \"5c5360dd-3416-47bb-9e45-b8517121fd45\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.721661 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-httpd-config\") pod \"5c5360dd-3416-47bb-9e45-b8517121fd45\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.721802 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-combined-ca-bundle\") pod \"5c5360dd-3416-47bb-9e45-b8517121fd45\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.721832 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-public-tls-certs\") pod \"5c5360dd-3416-47bb-9e45-b8517121fd45\" (UID: \"5c5360dd-3416-47bb-9e45-b8517121fd45\") " Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.726542 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5c5360dd-3416-47bb-9e45-b8517121fd45" (UID: "5c5360dd-3416-47bb-9e45-b8517121fd45"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.733095 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c5360dd-3416-47bb-9e45-b8517121fd45-kube-api-access-kd2wn" (OuterVolumeSpecName: "kube-api-access-kd2wn") pod "5c5360dd-3416-47bb-9e45-b8517121fd45" (UID: "5c5360dd-3416-47bb-9e45-b8517121fd45"). InnerVolumeSpecName "kube-api-access-kd2wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.826441 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd2wn\" (UniqueName: \"kubernetes.io/projected/5c5360dd-3416-47bb-9e45-b8517121fd45-kube-api-access-kd2wn\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.826475 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.844934 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c5360dd-3416-47bb-9e45-b8517121fd45" (UID: "5c5360dd-3416-47bb-9e45-b8517121fd45"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.849239 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5c5360dd-3416-47bb-9e45-b8517121fd45" (UID: "5c5360dd-3416-47bb-9e45-b8517121fd45"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.877796 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-config" (OuterVolumeSpecName: "config") pod "5c5360dd-3416-47bb-9e45-b8517121fd45" (UID: "5c5360dd-3416-47bb-9e45-b8517121fd45"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.879228 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5c5360dd-3416-47bb-9e45-b8517121fd45" (UID: "5c5360dd-3416-47bb-9e45-b8517121fd45"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.886249 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5c5360dd-3416-47bb-9e45-b8517121fd45" (UID: "5c5360dd-3416-47bb-9e45-b8517121fd45"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.915570 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.915638 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.915684 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.916499 4756 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9727a0a3407fffdca1e788ab9dbb2c6a316b6b73611747d0e9dff150fec50fa4"} pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.916579 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" containerID="cri-o://9727a0a3407fffdca1e788ab9dbb2c6a316b6b73611747d0e9dff150fec50fa4" gracePeriod=600 Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.929098 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.929138 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.929147 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.929155 4756 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:36 crc kubenswrapper[4756]: I0318 14:22:36.929165 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/5c5360dd-3416-47bb-9e45-b8517121fd45-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.201436 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b67498b8b-xjbnx" podUID="63030e19-cf53-4766-aeb1-e25be96a8652" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.186:9311/healthcheck\": read tcp 10.217.0.2:46420->10.217.0.186:9311: read: connection reset by peer" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.201986 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b67498b8b-xjbnx" podUID="63030e19-cf53-4766-aeb1-e25be96a8652" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.186:9311/healthcheck\": read tcp 10.217.0.2:46430->10.217.0.186:9311: read: connection reset by peer" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.326706 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f59ed4a2-e058-4c0f-acec-68e717c8da59" path="/var/lib/kubelet/pods/f59ed4a2-e058-4c0f-acec-68e717c8da59/volumes" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.447082 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"5da0e296-5cae-4c4e-9740-3459f59a6e05","Type":"ContainerStarted","Data":"454aa0b123e95b1eda0ece5fdec8fd5e43abb1a97b58ed9b55a5000ebda738c8"} Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.466367 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c574c99c-tvw8d" event={"ID":"5c5360dd-3416-47bb-9e45-b8517121fd45","Type":"ContainerDied","Data":"420ff5ffda4199d48c2f06213082807802ad4f5b0b7ce9655413b4e28b676b25"} Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.466420 4756 scope.go:117] "RemoveContainer" containerID="5f31b8b654771aa8604a532288d6818a726c175229175cbaafe4ea4b741b7df0" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.466506 
4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c574c99c-tvw8d" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.466957 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.203813437 podStartE2EDuration="3.466939391s" podCreationTimestamp="2026-03-18 14:22:34 +0000 UTC" firstStartedPulling="2026-03-18 14:22:35.437323102 +0000 UTC m=+1356.751741077" lastFinishedPulling="2026-03-18 14:22:36.700449066 +0000 UTC m=+1358.014867031" observedRunningTime="2026-03-18 14:22:37.461985367 +0000 UTC m=+1358.776403342" watchObservedRunningTime="2026-03-18 14:22:37.466939391 +0000 UTC m=+1358.781357366" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.475944 4756 generic.go:334] "Generic (PLEG): container finished" podID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerID="9727a0a3407fffdca1e788ab9dbb2c6a316b6b73611747d0e9dff150fec50fa4" exitCode=0 Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.476024 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerDied","Data":"9727a0a3407fffdca1e788ab9dbb2c6a316b6b73611747d0e9dff150fec50fa4"} Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.476052 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"434d4b042e3195c964b5c61982de1a71dbd601937856a288545bb36d7cbe0017"} Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.507335 4756 generic.go:334] "Generic (PLEG): container finished" podID="63030e19-cf53-4766-aeb1-e25be96a8652" containerID="9dfd38e589c690724b745aad477d20369b623572a57e28b404a808f7c7437b07" exitCode=0 Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.507450 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b67498b8b-xjbnx" event={"ID":"63030e19-cf53-4766-aeb1-e25be96a8652","Type":"ContainerDied","Data":"9dfd38e589c690724b745aad477d20369b623572a57e28b404a808f7c7437b07"} Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.509180 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c574c99c-tvw8d"] Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.516755 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"7b9a812f-0fcb-4212-81cb-154b37fae939","Type":"ContainerStarted","Data":"a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5"} Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.517384 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.519287 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" event={"ID":"d9909000-8e99-44f7-9f35-570757a60e59","Type":"ContainerStarted","Data":"5d9d11cf69fb28b9cd08dc8e149bc3293ca7150751413dc8440cd248ace5a13b"} Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.519559 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.541082 4756 scope.go:117] "RemoveContainer" containerID="9a9e9240c04440aab63f8bb3a1f0c3b56e111402c99293eab5524b0c7fbcd6d1" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.545905 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5c574c99c-tvw8d"] Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.575874 4756 scope.go:117] "RemoveContainer" containerID="cdecc63cb2f22e85e1b8370b385518ac960137ad6026c38922b0bcf6a6e1374a" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.588521 4756 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.588504588 podStartE2EDuration="3.588504588s" podCreationTimestamp="2026-03-18 14:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:37.534408065 +0000 UTC m=+1358.848826040" watchObservedRunningTime="2026-03-18 14:22:37.588504588 +0000 UTC m=+1358.902922563" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.597082 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" podStartSLOduration=3.5970635890000002 podStartE2EDuration="3.597063589s" podCreationTimestamp="2026-03-18 14:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:37.555672391 +0000 UTC m=+1358.870090356" watchObservedRunningTime="2026-03-18 14:22:37.597063589 +0000 UTC m=+1358.911481564" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.640642 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.749706 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-config-data\") pod \"63030e19-cf53-4766-aeb1-e25be96a8652\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.749768 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63030e19-cf53-4766-aeb1-e25be96a8652-logs\") pod \"63030e19-cf53-4766-aeb1-e25be96a8652\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.749888 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-combined-ca-bundle\") pod \"63030e19-cf53-4766-aeb1-e25be96a8652\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.750029 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7sd8\" (UniqueName: \"kubernetes.io/projected/63030e19-cf53-4766-aeb1-e25be96a8652-kube-api-access-d7sd8\") pod \"63030e19-cf53-4766-aeb1-e25be96a8652\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.750059 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-config-data-custom\") pod \"63030e19-cf53-4766-aeb1-e25be96a8652\" (UID: \"63030e19-cf53-4766-aeb1-e25be96a8652\") " Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.750489 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/63030e19-cf53-4766-aeb1-e25be96a8652-logs" (OuterVolumeSpecName: "logs") pod "63030e19-cf53-4766-aeb1-e25be96a8652" (UID: "63030e19-cf53-4766-aeb1-e25be96a8652"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.756897 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "63030e19-cf53-4766-aeb1-e25be96a8652" (UID: "63030e19-cf53-4766-aeb1-e25be96a8652"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.757062 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63030e19-cf53-4766-aeb1-e25be96a8652-kube-api-access-d7sd8" (OuterVolumeSpecName: "kube-api-access-d7sd8") pod "63030e19-cf53-4766-aeb1-e25be96a8652" (UID: "63030e19-cf53-4766-aeb1-e25be96a8652"). InnerVolumeSpecName "kube-api-access-d7sd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.795297 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63030e19-cf53-4766-aeb1-e25be96a8652" (UID: "63030e19-cf53-4766-aeb1-e25be96a8652"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.816291 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-config-data" (OuterVolumeSpecName: "config-data") pod "63030e19-cf53-4766-aeb1-e25be96a8652" (UID: "63030e19-cf53-4766-aeb1-e25be96a8652"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.852252 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7sd8\" (UniqueName: \"kubernetes.io/projected/63030e19-cf53-4766-aeb1-e25be96a8652-kube-api-access-d7sd8\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.852538 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.852600 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.852660 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63030e19-cf53-4766-aeb1-e25be96a8652-logs\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.852714 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63030e19-cf53-4766-aeb1-e25be96a8652-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.959797 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 18 14:22:37 crc kubenswrapper[4756]: I0318 14:22:37.981159 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 18 14:22:38 crc kubenswrapper[4756]: I0318 14:22:38.496219 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 14:22:38 crc kubenswrapper[4756]: I0318 14:22:38.533339 4756 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-api-6b67498b8b-xjbnx" event={"ID":"63030e19-cf53-4766-aeb1-e25be96a8652","Type":"ContainerDied","Data":"3fa843f5c2188d4bae25148714f1eef63b43a8f7d1ed344e3c719b06b53593ae"} Mar 18 14:22:38 crc kubenswrapper[4756]: I0318 14:22:38.533380 4756 scope.go:117] "RemoveContainer" containerID="9dfd38e589c690724b745aad477d20369b623572a57e28b404a808f7c7437b07" Mar 18 14:22:38 crc kubenswrapper[4756]: I0318 14:22:38.533474 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b67498b8b-xjbnx" Mar 18 14:22:38 crc kubenswrapper[4756]: I0318 14:22:38.565258 4756 scope.go:117] "RemoveContainer" containerID="8c2274ece5ebc8a4a190815d4ce05a9cb2a3dc9db13407bf3148be14c303c1b0" Mar 18 14:22:38 crc kubenswrapper[4756]: I0318 14:22:38.582151 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 14:22:38 crc kubenswrapper[4756]: I0318 14:22:38.582470 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="44ffe1f5-1bf5-4597-946e-af93cafc35e4" containerName="cinder-scheduler" containerID="cri-o://57f970cd2f535c245f5098bb2d2f042956c4b8735df33702323e9f0d6b066cbb" gracePeriod=30 Mar 18 14:22:38 crc kubenswrapper[4756]: I0318 14:22:38.582887 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="44ffe1f5-1bf5-4597-946e-af93cafc35e4" containerName="probe" containerID="cri-o://d57ad1ac9116b5cd3b2e37e876daff68f2ecb8deff212532ba0e1820ad42fb08" gracePeriod=30 Mar 18 14:22:38 crc kubenswrapper[4756]: I0318 14:22:38.593165 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b67498b8b-xjbnx"] Mar 18 14:22:38 crc kubenswrapper[4756]: I0318 14:22:38.606630 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6b67498b8b-xjbnx"] Mar 18 14:22:39 crc kubenswrapper[4756]: I0318 
14:22:39.338049 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c5360dd-3416-47bb-9e45-b8517121fd45" path="/var/lib/kubelet/pods/5c5360dd-3416-47bb-9e45-b8517121fd45/volumes" Mar 18 14:22:39 crc kubenswrapper[4756]: I0318 14:22:39.339662 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63030e19-cf53-4766-aeb1-e25be96a8652" path="/var/lib/kubelet/pods/63030e19-cf53-4766-aeb1-e25be96a8652/volumes" Mar 18 14:22:39 crc kubenswrapper[4756]: I0318 14:22:39.548065 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="5da0e296-5cae-4c4e-9740-3459f59a6e05" containerName="cloudkitty-proc" containerID="cri-o://454aa0b123e95b1eda0ece5fdec8fd5e43abb1a97b58ed9b55a5000ebda738c8" gracePeriod=30 Mar 18 14:22:39 crc kubenswrapper[4756]: I0318 14:22:39.548237 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="7b9a812f-0fcb-4212-81cb-154b37fae939" containerName="cloudkitty-api-log" containerID="cri-o://6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51" gracePeriod=30 Mar 18 14:22:39 crc kubenswrapper[4756]: I0318 14:22:39.548267 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="7b9a812f-0fcb-4212-81cb-154b37fae939" containerName="cloudkitty-api" containerID="cri-o://a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5" gracePeriod=30 Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.144872 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.205157 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7b9a812f-0fcb-4212-81cb-154b37fae939-certs\") pod \"7b9a812f-0fcb-4212-81cb-154b37fae939\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.205255 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gktzt\" (UniqueName: \"kubernetes.io/projected/7b9a812f-0fcb-4212-81cb-154b37fae939-kube-api-access-gktzt\") pod \"7b9a812f-0fcb-4212-81cb-154b37fae939\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.205386 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-config-data-custom\") pod \"7b9a812f-0fcb-4212-81cb-154b37fae939\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.205417 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-config-data\") pod \"7b9a812f-0fcb-4212-81cb-154b37fae939\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.205446 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9a812f-0fcb-4212-81cb-154b37fae939-logs\") pod \"7b9a812f-0fcb-4212-81cb-154b37fae939\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.205486 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-scripts\") pod \"7b9a812f-0fcb-4212-81cb-154b37fae939\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.205507 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-combined-ca-bundle\") pod \"7b9a812f-0fcb-4212-81cb-154b37fae939\" (UID: \"7b9a812f-0fcb-4212-81cb-154b37fae939\") " Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.206972 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b9a812f-0fcb-4212-81cb-154b37fae939-logs" (OuterVolumeSpecName: "logs") pod "7b9a812f-0fcb-4212-81cb-154b37fae939" (UID: "7b9a812f-0fcb-4212-81cb-154b37fae939"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.211529 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-scripts" (OuterVolumeSpecName: "scripts") pod "7b9a812f-0fcb-4212-81cb-154b37fae939" (UID: "7b9a812f-0fcb-4212-81cb-154b37fae939"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.211575 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7b9a812f-0fcb-4212-81cb-154b37fae939" (UID: "7b9a812f-0fcb-4212-81cb-154b37fae939"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.215285 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b9a812f-0fcb-4212-81cb-154b37fae939-kube-api-access-gktzt" (OuterVolumeSpecName: "kube-api-access-gktzt") pod "7b9a812f-0fcb-4212-81cb-154b37fae939" (UID: "7b9a812f-0fcb-4212-81cb-154b37fae939"). InnerVolumeSpecName "kube-api-access-gktzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.215342 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b9a812f-0fcb-4212-81cb-154b37fae939-certs" (OuterVolumeSpecName: "certs") pod "7b9a812f-0fcb-4212-81cb-154b37fae939" (UID: "7b9a812f-0fcb-4212-81cb-154b37fae939"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.235203 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b9a812f-0fcb-4212-81cb-154b37fae939" (UID: "7b9a812f-0fcb-4212-81cb-154b37fae939"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.250361 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-config-data" (OuterVolumeSpecName: "config-data") pod "7b9a812f-0fcb-4212-81cb-154b37fae939" (UID: "7b9a812f-0fcb-4212-81cb-154b37fae939"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.307795 4756 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/7b9a812f-0fcb-4212-81cb-154b37fae939-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.307829 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gktzt\" (UniqueName: \"kubernetes.io/projected/7b9a812f-0fcb-4212-81cb-154b37fae939-kube-api-access-gktzt\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.307840 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.307849 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.307858 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9a812f-0fcb-4212-81cb-154b37fae939-logs\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.307869 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.307877 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9a812f-0fcb-4212-81cb-154b37fae939-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.559523 4756 generic.go:334] "Generic (PLEG): container 
finished" podID="44ffe1f5-1bf5-4597-946e-af93cafc35e4" containerID="d57ad1ac9116b5cd3b2e37e876daff68f2ecb8deff212532ba0e1820ad42fb08" exitCode=0 Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.559615 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"44ffe1f5-1bf5-4597-946e-af93cafc35e4","Type":"ContainerDied","Data":"d57ad1ac9116b5cd3b2e37e876daff68f2ecb8deff212532ba0e1820ad42fb08"} Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.563738 4756 generic.go:334] "Generic (PLEG): container finished" podID="7b9a812f-0fcb-4212-81cb-154b37fae939" containerID="a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5" exitCode=0 Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.563806 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"7b9a812f-0fcb-4212-81cb-154b37fae939","Type":"ContainerDied","Data":"a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5"} Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.563850 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"7b9a812f-0fcb-4212-81cb-154b37fae939","Type":"ContainerDied","Data":"6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51"} Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.563784 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.563818 4756 generic.go:334] "Generic (PLEG): container finished" podID="7b9a812f-0fcb-4212-81cb-154b37fae939" containerID="6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51" exitCode=143 Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.563915 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"7b9a812f-0fcb-4212-81cb-154b37fae939","Type":"ContainerDied","Data":"0c70dcbd739e8222c79f9d2282c6890ef3ead1fbe9f5cf1b161c875e32df5a97"} Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.563921 4756 scope.go:117] "RemoveContainer" containerID="a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.594326 4756 scope.go:117] "RemoveContainer" containerID="6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.613319 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.632973 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.649323 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Mar 18 14:22:40 crc kubenswrapper[4756]: E0318 14:22:40.649952 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f59ed4a2-e058-4c0f-acec-68e717c8da59" containerName="dnsmasq-dns" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.649981 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f59ed4a2-e058-4c0f-acec-68e717c8da59" containerName="dnsmasq-dns" Mar 18 14:22:40 crc kubenswrapper[4756]: E0318 14:22:40.650009 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f59ed4a2-e058-4c0f-acec-68e717c8da59" 
containerName="init" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.650022 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f59ed4a2-e058-4c0f-acec-68e717c8da59" containerName="init" Mar 18 14:22:40 crc kubenswrapper[4756]: E0318 14:22:40.650048 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9a812f-0fcb-4212-81cb-154b37fae939" containerName="cloudkitty-api" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.650061 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9a812f-0fcb-4212-81cb-154b37fae939" containerName="cloudkitty-api" Mar 18 14:22:40 crc kubenswrapper[4756]: E0318 14:22:40.650092 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9a812f-0fcb-4212-81cb-154b37fae939" containerName="cloudkitty-api-log" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.650104 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9a812f-0fcb-4212-81cb-154b37fae939" containerName="cloudkitty-api-log" Mar 18 14:22:40 crc kubenswrapper[4756]: E0318 14:22:40.650225 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63030e19-cf53-4766-aeb1-e25be96a8652" containerName="barbican-api-log" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.650307 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="63030e19-cf53-4766-aeb1-e25be96a8652" containerName="barbican-api-log" Mar 18 14:22:40 crc kubenswrapper[4756]: E0318 14:22:40.650345 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63030e19-cf53-4766-aeb1-e25be96a8652" containerName="barbican-api" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.650363 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="63030e19-cf53-4766-aeb1-e25be96a8652" containerName="barbican-api" Mar 18 14:22:40 crc kubenswrapper[4756]: E0318 14:22:40.650412 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c5360dd-3416-47bb-9e45-b8517121fd45" containerName="neutron-httpd" Mar 18 
14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.650429 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c5360dd-3416-47bb-9e45-b8517121fd45" containerName="neutron-httpd" Mar 18 14:22:40 crc kubenswrapper[4756]: E0318 14:22:40.650464 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c5360dd-3416-47bb-9e45-b8517121fd45" containerName="neutron-api" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.650479 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c5360dd-3416-47bb-9e45-b8517121fd45" containerName="neutron-api" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.650936 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="63030e19-cf53-4766-aeb1-e25be96a8652" containerName="barbican-api-log" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.650968 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c5360dd-3416-47bb-9e45-b8517121fd45" containerName="neutron-api" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.650993 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f59ed4a2-e058-4c0f-acec-68e717c8da59" containerName="dnsmasq-dns" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.651024 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9a812f-0fcb-4212-81cb-154b37fae939" containerName="cloudkitty-api" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.651054 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="63030e19-cf53-4766-aeb1-e25be96a8652" containerName="barbican-api" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.651098 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c5360dd-3416-47bb-9e45-b8517121fd45" containerName="neutron-httpd" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.651155 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9a812f-0fcb-4212-81cb-154b37fae939" containerName="cloudkitty-api-log" Mar 18 14:22:40 
crc kubenswrapper[4756]: I0318 14:22:40.653719 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.659016 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.662651 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.662986 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.663106 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.681229 4756 scope.go:117] "RemoveContainer" containerID="a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5" Mar 18 14:22:40 crc kubenswrapper[4756]: E0318 14:22:40.683006 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5\": container with ID starting with a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5 not found: ID does not exist" containerID="a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.683048 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5"} err="failed to get container status \"a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5\": rpc error: code = NotFound desc = could not find container \"a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5\": container with ID starting with 
a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5 not found: ID does not exist" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.683075 4756 scope.go:117] "RemoveContainer" containerID="6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51" Mar 18 14:22:40 crc kubenswrapper[4756]: E0318 14:22:40.683492 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51\": container with ID starting with 6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51 not found: ID does not exist" containerID="6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.683534 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51"} err="failed to get container status \"6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51\": rpc error: code = NotFound desc = could not find container \"6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51\": container with ID starting with 6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51 not found: ID does not exist" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.683564 4756 scope.go:117] "RemoveContainer" containerID="a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.683980 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5"} err="failed to get container status \"a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5\": rpc error: code = NotFound desc = could not find container \"a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5\": container with ID 
starting with a4b24600f1ed0a8b6525a915869a73b4902dee988c61248353844c110b2bb7d5 not found: ID does not exist" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.684015 4756 scope.go:117] "RemoveContainer" containerID="6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.684348 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51"} err="failed to get container status \"6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51\": rpc error: code = NotFound desc = could not find container \"6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51\": container with ID starting with 6a53086be42ce2310eaf1e19ec9e0a3742ebc869686e1dd78fb4a48fd9749f51 not found: ID does not exist" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.717158 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cd47c362-dbd1-40b7-8f54-b967d5998fcd-certs\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.717232 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-config-data\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.717280 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " 
pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.717302 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-scripts\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.717373 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8sl7\" (UniqueName: \"kubernetes.io/projected/cd47c362-dbd1-40b7-8f54-b967d5998fcd-kube-api-access-j8sl7\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.717452 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.717500 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.717838 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc 
kubenswrapper[4756]: I0318 14:22:40.717903 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd47c362-dbd1-40b7-8f54-b967d5998fcd-logs\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.820230 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cd47c362-dbd1-40b7-8f54-b967d5998fcd-certs\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.820344 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-config-data\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.820428 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.820484 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-scripts\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.820517 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8sl7\" (UniqueName: 
\"kubernetes.io/projected/cd47c362-dbd1-40b7-8f54-b967d5998fcd-kube-api-access-j8sl7\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.820548 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.820585 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.820724 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.820779 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd47c362-dbd1-40b7-8f54-b967d5998fcd-logs\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.821263 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd47c362-dbd1-40b7-8f54-b967d5998fcd-logs\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 
crc kubenswrapper[4756]: I0318 14:22:40.828250 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.828846 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cd47c362-dbd1-40b7-8f54-b967d5998fcd-certs\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.832814 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-scripts\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.833408 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.833409 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.833682 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-config-data\") pod 
\"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.850743 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.855614 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8sl7\" (UniqueName: \"kubernetes.io/projected/cd47c362-dbd1-40b7-8f54-b967d5998fcd-kube-api-access-j8sl7\") pod \"cloudkitty-api-0\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " pod="openstack/cloudkitty-api-0" Mar 18 14:22:40 crc kubenswrapper[4756]: I0318 14:22:40.984435 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 18 14:22:41 crc kubenswrapper[4756]: I0318 14:22:41.202848 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 14:22:41 crc kubenswrapper[4756]: I0318 14:22:41.327588 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b9a812f-0fcb-4212-81cb-154b37fae939" path="/var/lib/kubelet/pods/7b9a812f-0fcb-4212-81cb-154b37fae939/volumes" Mar 18 14:22:41 crc kubenswrapper[4756]: I0318 14:22:41.509189 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 18 14:22:41 crc kubenswrapper[4756]: W0318 14:22:41.515866 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd47c362_dbd1_40b7_8f54_b967d5998fcd.slice/crio-bed026cff87645bdc31fc780073c1d7f985818edad8a280b57c9f8b93194e4d3 WatchSource:0}: Error finding container bed026cff87645bdc31fc780073c1d7f985818edad8a280b57c9f8b93194e4d3: 
Status 404 returned error can't find the container with id bed026cff87645bdc31fc780073c1d7f985818edad8a280b57c9f8b93194e4d3 Mar 18 14:22:41 crc kubenswrapper[4756]: I0318 14:22:41.581266 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"cd47c362-dbd1-40b7-8f54-b967d5998fcd","Type":"ContainerStarted","Data":"bed026cff87645bdc31fc780073c1d7f985818edad8a280b57c9f8b93194e4d3"} Mar 18 14:22:42 crc kubenswrapper[4756]: I0318 14:22:42.602293 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"cd47c362-dbd1-40b7-8f54-b967d5998fcd","Type":"ContainerStarted","Data":"2dac5e6313bd5acf6f685e0a53c4526e61552cb8e569499e8928466e7cc65afc"} Mar 18 14:22:42 crc kubenswrapper[4756]: I0318 14:22:42.602543 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"cd47c362-dbd1-40b7-8f54-b967d5998fcd","Type":"ContainerStarted","Data":"a6651cfd2b97688c0bf583665883ac7fd8b8ad929778ecd5484a983e72694ca7"} Mar 18 14:22:42 crc kubenswrapper[4756]: I0318 14:22:42.602934 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.296502 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.314923 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.314901204 podStartE2EDuration="3.314901204s" podCreationTimestamp="2026-03-18 14:22:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:42.631226048 +0000 UTC m=+1363.945644033" watchObservedRunningTime="2026-03-18 14:22:43.314901204 +0000 UTC m=+1364.629319179" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.372464 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-combined-ca-bundle\") pod \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.372547 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44ffe1f5-1bf5-4597-946e-af93cafc35e4-etc-machine-id\") pod \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.372631 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-config-data\") pod \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.372668 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m47qq\" (UniqueName: \"kubernetes.io/projected/44ffe1f5-1bf5-4597-946e-af93cafc35e4-kube-api-access-m47qq\") pod \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\" (UID: 
\"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.372738 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-config-data-custom\") pod \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.372868 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-scripts\") pod \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\" (UID: \"44ffe1f5-1bf5-4597-946e-af93cafc35e4\") " Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.374213 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44ffe1f5-1bf5-4597-946e-af93cafc35e4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "44ffe1f5-1bf5-4597-946e-af93cafc35e4" (UID: "44ffe1f5-1bf5-4597-946e-af93cafc35e4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.378530 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "44ffe1f5-1bf5-4597-946e-af93cafc35e4" (UID: "44ffe1f5-1bf5-4597-946e-af93cafc35e4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.379434 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ffe1f5-1bf5-4597-946e-af93cafc35e4-kube-api-access-m47qq" (OuterVolumeSpecName: "kube-api-access-m47qq") pod "44ffe1f5-1bf5-4597-946e-af93cafc35e4" (UID: "44ffe1f5-1bf5-4597-946e-af93cafc35e4"). 
InnerVolumeSpecName "kube-api-access-m47qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.387512 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-scripts" (OuterVolumeSpecName: "scripts") pod "44ffe1f5-1bf5-4597-946e-af93cafc35e4" (UID: "44ffe1f5-1bf5-4597-946e-af93cafc35e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.426536 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44ffe1f5-1bf5-4597-946e-af93cafc35e4" (UID: "44ffe1f5-1bf5-4597-946e-af93cafc35e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.475360 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.475396 4756 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44ffe1f5-1bf5-4597-946e-af93cafc35e4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.475411 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m47qq\" (UniqueName: \"kubernetes.io/projected/44ffe1f5-1bf5-4597-946e-af93cafc35e4-kube-api-access-m47qq\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.475424 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.475436 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.482993 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-config-data" (OuterVolumeSpecName: "config-data") pod "44ffe1f5-1bf5-4597-946e-af93cafc35e4" (UID: "44ffe1f5-1bf5-4597-946e-af93cafc35e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.576662 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ffe1f5-1bf5-4597-946e-af93cafc35e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.614984 4756 generic.go:334] "Generic (PLEG): container finished" podID="44ffe1f5-1bf5-4597-946e-af93cafc35e4" containerID="57f970cd2f535c245f5098bb2d2f042956c4b8735df33702323e9f0d6b066cbb" exitCode=0 Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.616424 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"44ffe1f5-1bf5-4597-946e-af93cafc35e4","Type":"ContainerDied","Data":"57f970cd2f535c245f5098bb2d2f042956c4b8735df33702323e9f0d6b066cbb"} Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.616473 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"44ffe1f5-1bf5-4597-946e-af93cafc35e4","Type":"ContainerDied","Data":"105a30526ea3488528884279c67353ce9dacb59302ab6410df33432e157a45f4"} Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.616429 4756 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.616490 4756 scope.go:117] "RemoveContainer" containerID="d57ad1ac9116b5cd3b2e37e876daff68f2ecb8deff212532ba0e1820ad42fb08" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.653343 4756 scope.go:117] "RemoveContainer" containerID="57f970cd2f535c245f5098bb2d2f042956c4b8735df33702323e9f0d6b066cbb" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.662956 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.677133 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.685168 4756 scope.go:117] "RemoveContainer" containerID="d57ad1ac9116b5cd3b2e37e876daff68f2ecb8deff212532ba0e1820ad42fb08" Mar 18 14:22:43 crc kubenswrapper[4756]: E0318 14:22:43.686410 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d57ad1ac9116b5cd3b2e37e876daff68f2ecb8deff212532ba0e1820ad42fb08\": container with ID starting with d57ad1ac9116b5cd3b2e37e876daff68f2ecb8deff212532ba0e1820ad42fb08 not found: ID does not exist" containerID="d57ad1ac9116b5cd3b2e37e876daff68f2ecb8deff212532ba0e1820ad42fb08" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.686451 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d57ad1ac9116b5cd3b2e37e876daff68f2ecb8deff212532ba0e1820ad42fb08"} err="failed to get container status \"d57ad1ac9116b5cd3b2e37e876daff68f2ecb8deff212532ba0e1820ad42fb08\": rpc error: code = NotFound desc = could not find container \"d57ad1ac9116b5cd3b2e37e876daff68f2ecb8deff212532ba0e1820ad42fb08\": container with ID starting with d57ad1ac9116b5cd3b2e37e876daff68f2ecb8deff212532ba0e1820ad42fb08 not found: ID does not exist" 
Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.686476 4756 scope.go:117] "RemoveContainer" containerID="57f970cd2f535c245f5098bb2d2f042956c4b8735df33702323e9f0d6b066cbb" Mar 18 14:22:43 crc kubenswrapper[4756]: E0318 14:22:43.686913 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57f970cd2f535c245f5098bb2d2f042956c4b8735df33702323e9f0d6b066cbb\": container with ID starting with 57f970cd2f535c245f5098bb2d2f042956c4b8735df33702323e9f0d6b066cbb not found: ID does not exist" containerID="57f970cd2f535c245f5098bb2d2f042956c4b8735df33702323e9f0d6b066cbb" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.686944 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f970cd2f535c245f5098bb2d2f042956c4b8735df33702323e9f0d6b066cbb"} err="failed to get container status \"57f970cd2f535c245f5098bb2d2f042956c4b8735df33702323e9f0d6b066cbb\": rpc error: code = NotFound desc = could not find container \"57f970cd2f535c245f5098bb2d2f042956c4b8735df33702323e9f0d6b066cbb\": container with ID starting with 57f970cd2f535c245f5098bb2d2f042956c4b8735df33702323e9f0d6b066cbb not found: ID does not exist" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.690546 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 14:22:43 crc kubenswrapper[4756]: E0318 14:22:43.690978 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ffe1f5-1bf5-4597-946e-af93cafc35e4" containerName="cinder-scheduler" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.690994 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ffe1f5-1bf5-4597-946e-af93cafc35e4" containerName="cinder-scheduler" Mar 18 14:22:43 crc kubenswrapper[4756]: E0318 14:22:43.691020 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ffe1f5-1bf5-4597-946e-af93cafc35e4" containerName="probe" Mar 18 14:22:43 
crc kubenswrapper[4756]: I0318 14:22:43.691026 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ffe1f5-1bf5-4597-946e-af93cafc35e4" containerName="probe" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.691243 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ffe1f5-1bf5-4597-946e-af93cafc35e4" containerName="cinder-scheduler" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.691266 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ffe1f5-1bf5-4597-946e-af93cafc35e4" containerName="probe" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.692316 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.695223 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.705736 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.780070 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.780157 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-config-data\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.780273 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.780484 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6glhc\" (UniqueName: \"kubernetes.io/projected/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-kube-api-access-6glhc\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.780600 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.780743 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-scripts\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.882418 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.882488 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-config-data\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.882548 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.882647 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6glhc\" (UniqueName: \"kubernetes.io/projected/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-kube-api-access-6glhc\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.882716 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.882764 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-scripts\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.883879 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " 
pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.887664 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-scripts\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.887939 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.888478 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.888701 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-config-data\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:43 crc kubenswrapper[4756]: I0318 14:22:43.902660 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6glhc\" (UniqueName: \"kubernetes.io/projected/4ffb8a9a-4e52-4db8-a22f-994a2cf222cf-kube-api-access-6glhc\") pod \"cinder-scheduler-0\" (UID: \"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf\") " pod="openstack/cinder-scheduler-0" Mar 18 14:22:44 crc kubenswrapper[4756]: I0318 14:22:44.011313 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 14:22:44 crc kubenswrapper[4756]: I0318 14:22:44.598105 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 14:22:44 crc kubenswrapper[4756]: W0318 14:22:44.598385 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ffb8a9a_4e52_4db8_a22f_994a2cf222cf.slice/crio-d1aedf1c44fb5c88fa882c9c3358d79487fbe8c9feb5fe3938d08f2733474906 WatchSource:0}: Error finding container d1aedf1c44fb5c88fa882c9c3358d79487fbe8c9feb5fe3938d08f2733474906: Status 404 returned error can't find the container with id d1aedf1c44fb5c88fa882c9c3358d79487fbe8c9feb5fe3938d08f2733474906 Mar 18 14:22:44 crc kubenswrapper[4756]: I0318 14:22:44.630815 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf","Type":"ContainerStarted","Data":"d1aedf1c44fb5c88fa882c9c3358d79487fbe8c9feb5fe3938d08f2733474906"} Mar 18 14:22:44 crc kubenswrapper[4756]: I0318 14:22:44.992440 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.069426 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9sh7"] Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.069649 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" podUID="a6759509-7c13-49a6-893a-86605058eabc" containerName="dnsmasq-dns" containerID="cri-o://fdc6356e5a4853b64e2f05704d770128b5465e8a47e7f010752fabaa5d058615" gracePeriod=10 Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.343880 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44ffe1f5-1bf5-4597-946e-af93cafc35e4" 
path="/var/lib/kubelet/pods/44ffe1f5-1bf5-4597-946e-af93cafc35e4/volumes" Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.643040 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf","Type":"ContainerStarted","Data":"b45399a02f3e800437c9f0ce63ea1224cf06e5e9b8cf42835eef25681c9d982d"} Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.656990 4756 generic.go:334] "Generic (PLEG): container finished" podID="a6759509-7c13-49a6-893a-86605058eabc" containerID="fdc6356e5a4853b64e2f05704d770128b5465e8a47e7f010752fabaa5d058615" exitCode=0 Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.657036 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" event={"ID":"a6759509-7c13-49a6-893a-86605058eabc","Type":"ContainerDied","Data":"fdc6356e5a4853b64e2f05704d770128b5465e8a47e7f010752fabaa5d058615"} Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.657091 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" event={"ID":"a6759509-7c13-49a6-893a-86605058eabc","Type":"ContainerDied","Data":"92e6cc276b86af550b15b283574207c2e805b6a5bea3b1f6b4944f302da2ecd1"} Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.657104 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92e6cc276b86af550b15b283574207c2e805b6a5bea3b1f6b4944f302da2ecd1" Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.689420 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.866324 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-dns-svc\") pod \"a6759509-7c13-49a6-893a-86605058eabc\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.866695 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-dns-swift-storage-0\") pod \"a6759509-7c13-49a6-893a-86605058eabc\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.866803 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-ovsdbserver-sb\") pod \"a6759509-7c13-49a6-893a-86605058eabc\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.866880 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-config\") pod \"a6759509-7c13-49a6-893a-86605058eabc\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.866936 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-ovsdbserver-nb\") pod \"a6759509-7c13-49a6-893a-86605058eabc\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.867005 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkr8m\" 
(UniqueName: \"kubernetes.io/projected/a6759509-7c13-49a6-893a-86605058eabc-kube-api-access-vkr8m\") pod \"a6759509-7c13-49a6-893a-86605058eabc\" (UID: \"a6759509-7c13-49a6-893a-86605058eabc\") " Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.878537 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6759509-7c13-49a6-893a-86605058eabc-kube-api-access-vkr8m" (OuterVolumeSpecName: "kube-api-access-vkr8m") pod "a6759509-7c13-49a6-893a-86605058eabc" (UID: "a6759509-7c13-49a6-893a-86605058eabc"). InnerVolumeSpecName "kube-api-access-vkr8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.921971 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-config" (OuterVolumeSpecName: "config") pod "a6759509-7c13-49a6-893a-86605058eabc" (UID: "a6759509-7c13-49a6-893a-86605058eabc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.924677 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a6759509-7c13-49a6-893a-86605058eabc" (UID: "a6759509-7c13-49a6-893a-86605058eabc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.940927 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a6759509-7c13-49a6-893a-86605058eabc" (UID: "a6759509-7c13-49a6-893a-86605058eabc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.946751 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6759509-7c13-49a6-893a-86605058eabc" (UID: "a6759509-7c13-49a6-893a-86605058eabc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.947252 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a6759509-7c13-49a6-893a-86605058eabc" (UID: "a6759509-7c13-49a6-893a-86605058eabc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.973616 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.973658 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.973669 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.973677 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkr8m\" (UniqueName: \"kubernetes.io/projected/a6759509-7c13-49a6-893a-86605058eabc-kube-api-access-vkr8m\") on node \"crc\" DevicePath 
\"\"" Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.973688 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:45 crc kubenswrapper[4756]: I0318 14:22:45.973697 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6759509-7c13-49a6-893a-86605058eabc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:46 crc kubenswrapper[4756]: I0318 14:22:46.669376 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m9sh7" Mar 18 14:22:46 crc kubenswrapper[4756]: I0318 14:22:46.670518 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4ffb8a9a-4e52-4db8-a22f-994a2cf222cf","Type":"ContainerStarted","Data":"ad42e96372e23447a694bb1447252e369bc66c807d6a3d5024206941f097fe5b"} Mar 18 14:22:46 crc kubenswrapper[4756]: I0318 14:22:46.713106 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.7130826470000002 podStartE2EDuration="3.713082647s" podCreationTimestamp="2026-03-18 14:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:46.704233928 +0000 UTC m=+1368.018651903" watchObservedRunningTime="2026-03-18 14:22:46.713082647 +0000 UTC m=+1368.027500622" Mar 18 14:22:46 crc kubenswrapper[4756]: I0318 14:22:46.739654 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9sh7"] Mar 18 14:22:46 crc kubenswrapper[4756]: I0318 14:22:46.752045 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9sh7"] Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.325823 4756 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6759509-7c13-49a6-893a-86605058eabc" path="/var/lib/kubelet/pods/a6759509-7c13-49a6-893a-86605058eabc/volumes" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.425033 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.601536 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dzb5\" (UniqueName: \"kubernetes.io/projected/5da0e296-5cae-4c4e-9740-3459f59a6e05-kube-api-access-4dzb5\") pod \"5da0e296-5cae-4c4e-9740-3459f59a6e05\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.601590 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-config-data-custom\") pod \"5da0e296-5cae-4c4e-9740-3459f59a6e05\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.601664 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-combined-ca-bundle\") pod \"5da0e296-5cae-4c4e-9740-3459f59a6e05\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.601817 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5da0e296-5cae-4c4e-9740-3459f59a6e05-certs\") pod \"5da0e296-5cae-4c4e-9740-3459f59a6e05\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.601849 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-config-data\") pod \"5da0e296-5cae-4c4e-9740-3459f59a6e05\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.601880 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-scripts\") pod \"5da0e296-5cae-4c4e-9740-3459f59a6e05\" (UID: \"5da0e296-5cae-4c4e-9740-3459f59a6e05\") " Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.609931 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-scripts" (OuterVolumeSpecName: "scripts") pod "5da0e296-5cae-4c4e-9740-3459f59a6e05" (UID: "5da0e296-5cae-4c4e-9740-3459f59a6e05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.611509 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5da0e296-5cae-4c4e-9740-3459f59a6e05" (UID: "5da0e296-5cae-4c4e-9740-3459f59a6e05"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.613939 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da0e296-5cae-4c4e-9740-3459f59a6e05-certs" (OuterVolumeSpecName: "certs") pod "5da0e296-5cae-4c4e-9740-3459f59a6e05" (UID: "5da0e296-5cae-4c4e-9740-3459f59a6e05"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.624465 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da0e296-5cae-4c4e-9740-3459f59a6e05-kube-api-access-4dzb5" (OuterVolumeSpecName: "kube-api-access-4dzb5") pod "5da0e296-5cae-4c4e-9740-3459f59a6e05" (UID: "5da0e296-5cae-4c4e-9740-3459f59a6e05"). InnerVolumeSpecName "kube-api-access-4dzb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.631871 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5da0e296-5cae-4c4e-9740-3459f59a6e05" (UID: "5da0e296-5cae-4c4e-9740-3459f59a6e05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.647315 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-config-data" (OuterVolumeSpecName: "config-data") pod "5da0e296-5cae-4c4e-9740-3459f59a6e05" (UID: "5da0e296-5cae-4c4e-9740-3459f59a6e05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.677795 4756 generic.go:334] "Generic (PLEG): container finished" podID="5da0e296-5cae-4c4e-9740-3459f59a6e05" containerID="454aa0b123e95b1eda0ece5fdec8fd5e43abb1a97b58ed9b55a5000ebda738c8" exitCode=0 Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.677853 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.677857 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"5da0e296-5cae-4c4e-9740-3459f59a6e05","Type":"ContainerDied","Data":"454aa0b123e95b1eda0ece5fdec8fd5e43abb1a97b58ed9b55a5000ebda738c8"} Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.677934 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"5da0e296-5cae-4c4e-9740-3459f59a6e05","Type":"ContainerDied","Data":"7cfb911adcb40ec8bc1083359eab597c541387cb2fff1dd8e78307d22b762825"} Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.677974 4756 scope.go:117] "RemoveContainer" containerID="454aa0b123e95b1eda0ece5fdec8fd5e43abb1a97b58ed9b55a5000ebda738c8" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.703754 4756 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5da0e296-5cae-4c4e-9740-3459f59a6e05-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.703990 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.704000 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.704008 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dzb5\" (UniqueName: \"kubernetes.io/projected/5da0e296-5cae-4c4e-9740-3459f59a6e05-kube-api-access-4dzb5\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.704017 4756 reconciler_common.go:293] "Volume detached for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.704025 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5da0e296-5cae-4c4e-9740-3459f59a6e05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.717571 4756 scope.go:117] "RemoveContainer" containerID="454aa0b123e95b1eda0ece5fdec8fd5e43abb1a97b58ed9b55a5000ebda738c8" Mar 18 14:22:47 crc kubenswrapper[4756]: E0318 14:22:47.718185 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"454aa0b123e95b1eda0ece5fdec8fd5e43abb1a97b58ed9b55a5000ebda738c8\": container with ID starting with 454aa0b123e95b1eda0ece5fdec8fd5e43abb1a97b58ed9b55a5000ebda738c8 not found: ID does not exist" containerID="454aa0b123e95b1eda0ece5fdec8fd5e43abb1a97b58ed9b55a5000ebda738c8" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.718231 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"454aa0b123e95b1eda0ece5fdec8fd5e43abb1a97b58ed9b55a5000ebda738c8"} err="failed to get container status \"454aa0b123e95b1eda0ece5fdec8fd5e43abb1a97b58ed9b55a5000ebda738c8\": rpc error: code = NotFound desc = could not find container \"454aa0b123e95b1eda0ece5fdec8fd5e43abb1a97b58ed9b55a5000ebda738c8\": container with ID starting with 454aa0b123e95b1eda0ece5fdec8fd5e43abb1a97b58ed9b55a5000ebda738c8 not found: ID does not exist" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.724612 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.736208 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 18 14:22:47 crc 
kubenswrapper[4756]: I0318 14:22:47.754523 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 18 14:22:47 crc kubenswrapper[4756]: E0318 14:22:47.754917 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6759509-7c13-49a6-893a-86605058eabc" containerName="dnsmasq-dns" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.754932 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6759509-7c13-49a6-893a-86605058eabc" containerName="dnsmasq-dns" Mar 18 14:22:47 crc kubenswrapper[4756]: E0318 14:22:47.754949 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6759509-7c13-49a6-893a-86605058eabc" containerName="init" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.754956 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6759509-7c13-49a6-893a-86605058eabc" containerName="init" Mar 18 14:22:47 crc kubenswrapper[4756]: E0318 14:22:47.754965 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da0e296-5cae-4c4e-9740-3459f59a6e05" containerName="cloudkitty-proc" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.754971 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da0e296-5cae-4c4e-9740-3459f59a6e05" containerName="cloudkitty-proc" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.755195 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6759509-7c13-49a6-893a-86605058eabc" containerName="dnsmasq-dns" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.755213 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da0e296-5cae-4c4e-9740-3459f59a6e05" containerName="cloudkitty-proc" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.755911 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.758586 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.761437 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.775099 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.797095 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.907784 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8krnt\" (UniqueName: \"kubernetes.io/projected/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-kube-api-access-8krnt\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.907906 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-scripts\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.907928 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-config-data\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.907951 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-certs\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.908051 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.908249 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:47 crc kubenswrapper[4756]: I0318 14:22:47.952742 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7f64d44848-xg692" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.010304 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-scripts\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.010349 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-config-data\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 
14:22:48.010376 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-certs\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.010397 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.010437 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.010539 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8krnt\" (UniqueName: \"kubernetes.io/projected/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-kube-api-access-8krnt\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.023854 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.025901 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-certs\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.026389 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.027089 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-scripts\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.029446 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-config-data\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.046544 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8krnt\" (UniqueName: \"kubernetes.io/projected/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-kube-api-access-8krnt\") pod \"cloudkitty-proc-0\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.048480 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-654d496b7d-zrbn7"] Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.050980 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.063083 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-654d496b7d-zrbn7"] Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.074267 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.216650 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-public-tls-certs\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.216984 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqwwb\" (UniqueName: \"kubernetes.io/projected/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-kube-api-access-vqwwb\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.217009 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-internal-tls-certs\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.217079 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-scripts\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " 
pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.217130 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-combined-ca-bundle\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.217165 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-config-data\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.217183 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-logs\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.346386 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-scripts\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.346516 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-combined-ca-bundle\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 
14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.346582 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-config-data\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.346612 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-logs\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.346683 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-public-tls-certs\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.346773 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqwwb\" (UniqueName: \"kubernetes.io/projected/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-kube-api-access-vqwwb\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.346811 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-internal-tls-certs\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.348712 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-logs\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.353960 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-combined-ca-bundle\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.354061 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-scripts\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.355241 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-public-tls-certs\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.357055 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-config-data\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.357523 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-internal-tls-certs\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.369359 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqwwb\" (UniqueName: \"kubernetes.io/projected/fc4b5923-eb57-490b-a642-1a56d8a7b9b7-kube-api-access-vqwwb\") pod \"placement-654d496b7d-zrbn7\" (UID: \"fc4b5923-eb57-490b-a642-1a56d8a7b9b7\") " pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.487825 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.595182 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.690518 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459","Type":"ContainerStarted","Data":"984634bbda65dc8e437d31246e224f4d083fdc5ba9a9bf357e0c3f3dbf791e91"} Mar 18 14:22:48 crc kubenswrapper[4756]: I0318 14:22:48.947941 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-654d496b7d-zrbn7"] Mar 18 14:22:48 crc kubenswrapper[4756]: W0318 14:22:48.949838 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc4b5923_eb57_490b_a642_1a56d8a7b9b7.slice/crio-9e799648522fb0b13d31af3e3bbe0e73edb994e6f22e2653ddbcd718cb67eaa5 WatchSource:0}: Error finding container 9e799648522fb0b13d31af3e3bbe0e73edb994e6f22e2653ddbcd718cb67eaa5: Status 404 returned error can't find the container with id 9e799648522fb0b13d31af3e3bbe0e73edb994e6f22e2653ddbcd718cb67eaa5 Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 
14:22:49.011818 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.344379 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5da0e296-5cae-4c4e-9740-3459f59a6e05" path="/var/lib/kubelet/pods/5da0e296-5cae-4c4e-9740-3459f59a6e05/volumes" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.344961 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.349288 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.354714 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.354873 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.355499 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-49gqx" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.358444 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.484075 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssbs7\" (UniqueName: \"kubernetes.io/projected/8d04aabe-9d1b-4e88-9233-b8002e968aa4-kube-api-access-ssbs7\") pod \"openstackclient\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " pod="openstack/openstackclient" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.484470 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/8d04aabe-9d1b-4e88-9233-b8002e968aa4-openstack-config\") pod \"openstackclient\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " pod="openstack/openstackclient" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.484782 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d04aabe-9d1b-4e88-9233-b8002e968aa4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " pod="openstack/openstackclient" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.484820 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8d04aabe-9d1b-4e88-9233-b8002e968aa4-openstack-config-secret\") pod \"openstackclient\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " pod="openstack/openstackclient" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.586860 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d04aabe-9d1b-4e88-9233-b8002e968aa4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " pod="openstack/openstackclient" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.586910 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8d04aabe-9d1b-4e88-9233-b8002e968aa4-openstack-config-secret\") pod \"openstackclient\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " pod="openstack/openstackclient" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.587022 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssbs7\" (UniqueName: \"kubernetes.io/projected/8d04aabe-9d1b-4e88-9233-b8002e968aa4-kube-api-access-ssbs7\") pod 
\"openstackclient\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " pod="openstack/openstackclient" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.587057 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8d04aabe-9d1b-4e88-9233-b8002e968aa4-openstack-config\") pod \"openstackclient\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " pod="openstack/openstackclient" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.588177 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8d04aabe-9d1b-4e88-9233-b8002e968aa4-openstack-config\") pod \"openstackclient\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " pod="openstack/openstackclient" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.592190 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8d04aabe-9d1b-4e88-9233-b8002e968aa4-openstack-config-secret\") pod \"openstackclient\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " pod="openstack/openstackclient" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.597052 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d04aabe-9d1b-4e88-9233-b8002e968aa4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " pod="openstack/openstackclient" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.609771 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssbs7\" (UniqueName: \"kubernetes.io/projected/8d04aabe-9d1b-4e88-9233-b8002e968aa4-kube-api-access-ssbs7\") pod \"openstackclient\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " pod="openstack/openstackclient" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.718871 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-654d496b7d-zrbn7" event={"ID":"fc4b5923-eb57-490b-a642-1a56d8a7b9b7","Type":"ContainerStarted","Data":"bcaf6ec8dfd38dfcd226ec4d3f48f84cb8be2458094a691632932553cf71c3d5"} Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.719895 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-654d496b7d-zrbn7" event={"ID":"fc4b5923-eb57-490b-a642-1a56d8a7b9b7","Type":"ContainerStarted","Data":"38cf87069a963842d9220de1ceeb8abe7fe70188af64db779f6c590632f53210"} Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.719988 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-654d496b7d-zrbn7" event={"ID":"fc4b5923-eb57-490b-a642-1a56d8a7b9b7","Type":"ContainerStarted","Data":"9e799648522fb0b13d31af3e3bbe0e73edb994e6f22e2653ddbcd718cb67eaa5"} Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.720253 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.720331 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.724831 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.727235 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459","Type":"ContainerStarted","Data":"dc58a10256720f85ecc79985639090d2de00d69ed7451735905f7e194e7d28e0"} Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.768075 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-654d496b7d-zrbn7" podStartSLOduration=1.768059421 podStartE2EDuration="1.768059421s" podCreationTimestamp="2026-03-18 14:22:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:49.765567933 +0000 UTC m=+1371.079985898" watchObservedRunningTime="2026-03-18 14:22:49.768059421 +0000 UTC m=+1371.082477396" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.833415 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.833395997 podStartE2EDuration="2.833395997s" podCreationTimestamp="2026-03-18 14:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:49.817475997 +0000 UTC m=+1371.131893972" watchObservedRunningTime="2026-03-18 14:22:49.833395997 +0000 UTC m=+1371.147813972" Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.982998 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 18 14:22:49 crc kubenswrapper[4756]: I0318 14:22:49.996004 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 14:22:50.070253 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 
14:22:50.073606 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 14:22:50.204216 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/26157f10-41ab-4c9e-836d-136febf288cd-openstack-config-secret\") pod \"openstackclient\" (UID: \"26157f10-41ab-4c9e-836d-136febf288cd\") " pod="openstack/openstackclient" Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 14:22:50.204329 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/26157f10-41ab-4c9e-836d-136febf288cd-openstack-config\") pod \"openstackclient\" (UID: \"26157f10-41ab-4c9e-836d-136febf288cd\") " pod="openstack/openstackclient" Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 14:22:50.204383 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26157f10-41ab-4c9e-836d-136febf288cd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"26157f10-41ab-4c9e-836d-136febf288cd\") " pod="openstack/openstackclient" Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 14:22:50.204453 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ftnz\" (UniqueName: \"kubernetes.io/projected/26157f10-41ab-4c9e-836d-136febf288cd-kube-api-access-6ftnz\") pod \"openstackclient\" (UID: \"26157f10-41ab-4c9e-836d-136febf288cd\") " pod="openstack/openstackclient" Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 14:22:50.222180 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 14:22:50.307940 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/26157f10-41ab-4c9e-836d-136febf288cd-openstack-config\") pod \"openstackclient\" (UID: \"26157f10-41ab-4c9e-836d-136febf288cd\") " pod="openstack/openstackclient" Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 14:22:50.308029 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26157f10-41ab-4c9e-836d-136febf288cd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"26157f10-41ab-4c9e-836d-136febf288cd\") " pod="openstack/openstackclient" Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 14:22:50.308102 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ftnz\" (UniqueName: \"kubernetes.io/projected/26157f10-41ab-4c9e-836d-136febf288cd-kube-api-access-6ftnz\") pod \"openstackclient\" (UID: \"26157f10-41ab-4c9e-836d-136febf288cd\") " pod="openstack/openstackclient" Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 14:22:50.308146 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/26157f10-41ab-4c9e-836d-136febf288cd-openstack-config-secret\") pod \"openstackclient\" (UID: \"26157f10-41ab-4c9e-836d-136febf288cd\") " pod="openstack/openstackclient" Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 14:22:50.315096 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/26157f10-41ab-4c9e-836d-136febf288cd-openstack-config\") pod \"openstackclient\" (UID: \"26157f10-41ab-4c9e-836d-136febf288cd\") " pod="openstack/openstackclient" Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 14:22:50.333422 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/26157f10-41ab-4c9e-836d-136febf288cd-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"26157f10-41ab-4c9e-836d-136febf288cd\") " pod="openstack/openstackclient" Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 14:22:50.346058 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26157f10-41ab-4c9e-836d-136febf288cd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"26157f10-41ab-4c9e-836d-136febf288cd\") " pod="openstack/openstackclient" Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 14:22:50.362695 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ftnz\" (UniqueName: \"kubernetes.io/projected/26157f10-41ab-4c9e-836d-136febf288cd-kube-api-access-6ftnz\") pod \"openstackclient\" (UID: \"26157f10-41ab-4c9e-836d-136febf288cd\") " pod="openstack/openstackclient" Mar 18 14:22:50 crc kubenswrapper[4756]: I0318 14:22:50.443559 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 14:22:50 crc kubenswrapper[4756]: E0318 14:22:50.853093 4756 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 18 14:22:50 crc kubenswrapper[4756]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_8d04aabe-9d1b-4e88-9233-b8002e968aa4_0(c6ef02d6f4c20c11a00905371c62e17c2834173266450a05f215abe07ca81628): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c6ef02d6f4c20c11a00905371c62e17c2834173266450a05f215abe07ca81628" Netns:"/var/run/netns/8ff032a2-19d0-4894-85a9-8aa811a509bb" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=c6ef02d6f4c20c11a00905371c62e17c2834173266450a05f215abe07ca81628;K8S_POD_UID=8d04aabe-9d1b-4e88-9233-b8002e968aa4" Path:"" ERRORED: error configuring pod [openstack/openstackclient] 
networking: [openstack/openstackclient/8d04aabe-9d1b-4e88-9233-b8002e968aa4:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient c6ef02d6f4c20c11a00905371c62e17c2834173266450a05f215abe07ca81628 network default NAD default] [openstack/openstackclient c6ef02d6f4c20c11a00905371c62e17c2834173266450a05f215abe07ca81628 network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:c8 [10.217.0.200/23] Mar 18 14:22:50 crc kubenswrapper[4756]: ' Mar 18 14:22:50 crc kubenswrapper[4756]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 14:22:50 crc kubenswrapper[4756]: > Mar 18 14:22:50 crc kubenswrapper[4756]: E0318 14:22:50.853463 4756 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 18 14:22:50 crc kubenswrapper[4756]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_8d04aabe-9d1b-4e88-9233-b8002e968aa4_0(c6ef02d6f4c20c11a00905371c62e17c2834173266450a05f215abe07ca81628): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c6ef02d6f4c20c11a00905371c62e17c2834173266450a05f215abe07ca81628" Netns:"/var/run/netns/8ff032a2-19d0-4894-85a9-8aa811a509bb" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=c6ef02d6f4c20c11a00905371c62e17c2834173266450a05f215abe07ca81628;K8S_POD_UID=8d04aabe-9d1b-4e88-9233-b8002e968aa4" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/8d04aabe-9d1b-4e88-9233-b8002e968aa4:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient c6ef02d6f4c20c11a00905371c62e17c2834173266450a05f215abe07ca81628 network default NAD default] [openstack/openstackclient c6ef02d6f4c20c11a00905371c62e17c2834173266450a05f215abe07ca81628 network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:c8 [10.217.0.200/23] Mar 18 14:22:50 crc kubenswrapper[4756]: ' Mar 18 14:22:50 crc kubenswrapper[4756]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 14:22:50 crc kubenswrapper[4756]: > pod="openstack/openstackclient" Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.343734 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 14:22:51 crc kubenswrapper[4756]: W0318 14:22:51.346554 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26157f10_41ab_4c9e_836d_136febf288cd.slice/crio-bb18fee06a8dcf98b546afa553490e8bb4e9e1913ef55c323f2a075d94a427d2 WatchSource:0}: Error finding container bb18fee06a8dcf98b546afa553490e8bb4e9e1913ef55c323f2a075d94a427d2: Status 404 returned error can't find the container 
with id bb18fee06a8dcf98b546afa553490e8bb4e9e1913ef55c323f2a075d94a427d2 Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.761259 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"26157f10-41ab-4c9e-836d-136febf288cd","Type":"ContainerStarted","Data":"bb18fee06a8dcf98b546afa553490e8bb4e9e1913ef55c323f2a075d94a427d2"} Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.761294 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.772268 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.774964 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8d04aabe-9d1b-4e88-9233-b8002e968aa4" podUID="26157f10-41ab-4c9e-836d-136febf288cd" Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.846441 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssbs7\" (UniqueName: \"kubernetes.io/projected/8d04aabe-9d1b-4e88-9233-b8002e968aa4-kube-api-access-ssbs7\") pod \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.846531 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8d04aabe-9d1b-4e88-9233-b8002e968aa4-openstack-config\") pod \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.846645 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8d04aabe-9d1b-4e88-9233-b8002e968aa4-openstack-config-secret\") pod 
\"8d04aabe-9d1b-4e88-9233-b8002e968aa4\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.846714 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d04aabe-9d1b-4e88-9233-b8002e968aa4-combined-ca-bundle\") pod \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\" (UID: \"8d04aabe-9d1b-4e88-9233-b8002e968aa4\") " Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.847196 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d04aabe-9d1b-4e88-9233-b8002e968aa4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "8d04aabe-9d1b-4e88-9233-b8002e968aa4" (UID: "8d04aabe-9d1b-4e88-9233-b8002e968aa4"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.853045 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d04aabe-9d1b-4e88-9233-b8002e968aa4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "8d04aabe-9d1b-4e88-9233-b8002e968aa4" (UID: "8d04aabe-9d1b-4e88-9233-b8002e968aa4"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.858770 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d04aabe-9d1b-4e88-9233-b8002e968aa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d04aabe-9d1b-4e88-9233-b8002e968aa4" (UID: "8d04aabe-9d1b-4e88-9233-b8002e968aa4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.866385 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d04aabe-9d1b-4e88-9233-b8002e968aa4-kube-api-access-ssbs7" (OuterVolumeSpecName: "kube-api-access-ssbs7") pod "8d04aabe-9d1b-4e88-9233-b8002e968aa4" (UID: "8d04aabe-9d1b-4e88-9233-b8002e968aa4"). InnerVolumeSpecName "kube-api-access-ssbs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.949337 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssbs7\" (UniqueName: \"kubernetes.io/projected/8d04aabe-9d1b-4e88-9233-b8002e968aa4-kube-api-access-ssbs7\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.949595 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8d04aabe-9d1b-4e88-9233-b8002e968aa4-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.949609 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8d04aabe-9d1b-4e88-9233-b8002e968aa4-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:51 crc kubenswrapper[4756]: I0318 14:22:51.949618 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d04aabe-9d1b-4e88-9233-b8002e968aa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:52 crc kubenswrapper[4756]: I0318 14:22:52.769669 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 14:22:52 crc kubenswrapper[4756]: I0318 14:22:52.788694 4756 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8d04aabe-9d1b-4e88-9233-b8002e968aa4" podUID="26157f10-41ab-4c9e-836d-136febf288cd" Mar 18 14:22:53 crc kubenswrapper[4756]: I0318 14:22:53.327094 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d04aabe-9d1b-4e88-9233-b8002e968aa4" path="/var/lib/kubelet/pods/8d04aabe-9d1b-4e88-9233-b8002e968aa4/volumes" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.224590 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7454cc5499-pkq5t"] Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.237161 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7454cc5499-pkq5t"] Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.237264 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.242932 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.243290 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.246344 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.305490 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.405323 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-combined-ca-bundle\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.405400 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-run-httpd\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.405506 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-internal-tls-certs\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc 
kubenswrapper[4756]: I0318 14:22:54.405600 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-public-tls-certs\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.406001 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-etc-swift\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.406074 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4tmg\" (UniqueName: \"kubernetes.io/projected/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-kube-api-access-x4tmg\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.406175 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-config-data\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.406536 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-log-httpd\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " 
pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.508907 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-combined-ca-bundle\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.508965 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-run-httpd\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.509009 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-internal-tls-certs\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.509071 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-public-tls-certs\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.509143 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-etc-swift\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 
crc kubenswrapper[4756]: I0318 14:22:54.509178 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4tmg\" (UniqueName: \"kubernetes.io/projected/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-kube-api-access-x4tmg\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.509222 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-config-data\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.509256 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-log-httpd\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.509519 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-run-httpd\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.509666 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-log-httpd\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.516225 4756 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-etc-swift\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.517832 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-public-tls-certs\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.517838 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-config-data\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.519883 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-internal-tls-certs\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.520627 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-combined-ca-bundle\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.527317 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4tmg\" (UniqueName: 
\"kubernetes.io/projected/d3a58bb2-2a9a-4867-a60d-8ea354621ff6-kube-api-access-x4tmg\") pod \"swift-proxy-7454cc5499-pkq5t\" (UID: \"d3a58bb2-2a9a-4867-a60d-8ea354621ff6\") " pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:54 crc kubenswrapper[4756]: I0318 14:22:54.564355 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:55 crc kubenswrapper[4756]: W0318 14:22:55.129760 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3a58bb2_2a9a_4867_a60d_8ea354621ff6.slice/crio-d1755af754a8b183f954856c11b139c3fa153e6540ed83aaa87b38198c377474 WatchSource:0}: Error finding container d1755af754a8b183f954856c11b139c3fa153e6540ed83aaa87b38198c377474: Status 404 returned error can't find the container with id d1755af754a8b183f954856c11b139c3fa153e6540ed83aaa87b38198c377474 Mar 18 14:22:55 crc kubenswrapper[4756]: I0318 14:22:55.135766 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7454cc5499-pkq5t"] Mar 18 14:22:55 crc kubenswrapper[4756]: I0318 14:22:55.800376 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7454cc5499-pkq5t" event={"ID":"d3a58bb2-2a9a-4867-a60d-8ea354621ff6","Type":"ContainerStarted","Data":"cc81ea3c56dab9b38dff132492b161ef6cdd4e8127df032837b9de1f2b6c264f"} Mar 18 14:22:55 crc kubenswrapper[4756]: I0318 14:22:55.800719 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:55 crc kubenswrapper[4756]: I0318 14:22:55.800731 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7454cc5499-pkq5t" Mar 18 14:22:55 crc kubenswrapper[4756]: I0318 14:22:55.800739 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7454cc5499-pkq5t" 
event={"ID":"d3a58bb2-2a9a-4867-a60d-8ea354621ff6","Type":"ContainerStarted","Data":"c1d7b20fa290442238f273026d8879a3a41adae9f2ba9438fd8517a1ab38f619"} Mar 18 14:22:55 crc kubenswrapper[4756]: I0318 14:22:55.800748 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7454cc5499-pkq5t" event={"ID":"d3a58bb2-2a9a-4867-a60d-8ea354621ff6","Type":"ContainerStarted","Data":"d1755af754a8b183f954856c11b139c3fa153e6540ed83aaa87b38198c377474"} Mar 18 14:22:55 crc kubenswrapper[4756]: I0318 14:22:55.828270 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7454cc5499-pkq5t" podStartSLOduration=1.828228142 podStartE2EDuration="1.828228142s" podCreationTimestamp="2026-03-18 14:22:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:22:55.820688759 +0000 UTC m=+1377.135106744" watchObservedRunningTime="2026-03-18 14:22:55.828228142 +0000 UTC m=+1377.142646117" Mar 18 14:22:57 crc kubenswrapper[4756]: I0318 14:22:57.948483 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 14:22:57 crc kubenswrapper[4756]: I0318 14:22:57.948987 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2f752fe8-a008-4737-873f-0ae42990431f" containerName="glance-log" containerID="cri-o://0b4362bbe23b3f2e2d554dc797b564555b1676795a8f7418c1957f3bec62f4bb" gracePeriod=30 Mar 18 14:22:57 crc kubenswrapper[4756]: I0318 14:22:57.949480 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2f752fe8-a008-4737-873f-0ae42990431f" containerName="glance-httpd" containerID="cri-o://7f8c73a200ab1e9fbc6ec57df49d0fa4daf59882731c743ae02b406a84aa05dd" gracePeriod=30 Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.611149 4756 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-98s5z"] Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.612622 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-98s5z" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.631147 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-98s5z"] Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.692984 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-f9qhz"] Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.693646 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1190e7a3-cde5-467e-adbc-b20c6f7823d5-operator-scripts\") pod \"nova-api-db-create-98s5z\" (UID: \"1190e7a3-cde5-467e-adbc-b20c6f7823d5\") " pod="openstack/nova-api-db-create-98s5z" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.693693 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plbgt\" (UniqueName: \"kubernetes.io/projected/1190e7a3-cde5-467e-adbc-b20c6f7823d5-kube-api-access-plbgt\") pod \"nova-api-db-create-98s5z\" (UID: \"1190e7a3-cde5-467e-adbc-b20c6f7823d5\") " pod="openstack/nova-api-db-create-98s5z" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.694615 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-f9qhz" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.738408 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f9qhz"] Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.765318 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-ac52-account-create-update-kxhc4"] Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.766706 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ac52-account-create-update-kxhc4" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.769243 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.799088 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c62dc13-d167-48da-909e-eff8ca5852f4-operator-scripts\") pod \"nova-cell0-db-create-f9qhz\" (UID: \"8c62dc13-d167-48da-909e-eff8ca5852f4\") " pod="openstack/nova-cell0-db-create-f9qhz" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.799168 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1190e7a3-cde5-467e-adbc-b20c6f7823d5-operator-scripts\") pod \"nova-api-db-create-98s5z\" (UID: \"1190e7a3-cde5-467e-adbc-b20c6f7823d5\") " pod="openstack/nova-api-db-create-98s5z" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.799192 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plbgt\" (UniqueName: \"kubernetes.io/projected/1190e7a3-cde5-467e-adbc-b20c6f7823d5-kube-api-access-plbgt\") pod \"nova-api-db-create-98s5z\" (UID: \"1190e7a3-cde5-467e-adbc-b20c6f7823d5\") " pod="openstack/nova-api-db-create-98s5z" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 
14:22:58.799223 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wntc4\" (UniqueName: \"kubernetes.io/projected/8c62dc13-d167-48da-909e-eff8ca5852f4-kube-api-access-wntc4\") pod \"nova-cell0-db-create-f9qhz\" (UID: \"8c62dc13-d167-48da-909e-eff8ca5852f4\") " pod="openstack/nova-cell0-db-create-f9qhz" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.800012 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1190e7a3-cde5-467e-adbc-b20c6f7823d5-operator-scripts\") pod \"nova-api-db-create-98s5z\" (UID: \"1190e7a3-cde5-467e-adbc-b20c6f7823d5\") " pod="openstack/nova-api-db-create-98s5z" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.811881 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ac52-account-create-update-kxhc4"] Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.840424 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plbgt\" (UniqueName: \"kubernetes.io/projected/1190e7a3-cde5-467e-adbc-b20c6f7823d5-kube-api-access-plbgt\") pod \"nova-api-db-create-98s5z\" (UID: \"1190e7a3-cde5-467e-adbc-b20c6f7823d5\") " pod="openstack/nova-api-db-create-98s5z" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.903460 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c62dc13-d167-48da-909e-eff8ca5852f4-operator-scripts\") pod \"nova-cell0-db-create-f9qhz\" (UID: \"8c62dc13-d167-48da-909e-eff8ca5852f4\") " pod="openstack/nova-cell0-db-create-f9qhz" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.903540 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5aabe4a2-969f-44f7-aa29-60db47845f80-operator-scripts\") pod 
\"nova-api-ac52-account-create-update-kxhc4\" (UID: \"5aabe4a2-969f-44f7-aa29-60db47845f80\") " pod="openstack/nova-api-ac52-account-create-update-kxhc4" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.903571 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wntc4\" (UniqueName: \"kubernetes.io/projected/8c62dc13-d167-48da-909e-eff8ca5852f4-kube-api-access-wntc4\") pod \"nova-cell0-db-create-f9qhz\" (UID: \"8c62dc13-d167-48da-909e-eff8ca5852f4\") " pod="openstack/nova-cell0-db-create-f9qhz" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.903619 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktcj5\" (UniqueName: \"kubernetes.io/projected/5aabe4a2-969f-44f7-aa29-60db47845f80-kube-api-access-ktcj5\") pod \"nova-api-ac52-account-create-update-kxhc4\" (UID: \"5aabe4a2-969f-44f7-aa29-60db47845f80\") " pod="openstack/nova-api-ac52-account-create-update-kxhc4" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.904400 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c62dc13-d167-48da-909e-eff8ca5852f4-operator-scripts\") pod \"nova-cell0-db-create-f9qhz\" (UID: \"8c62dc13-d167-48da-909e-eff8ca5852f4\") " pod="openstack/nova-cell0-db-create-f9qhz" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.913830 4756 generic.go:334] "Generic (PLEG): container finished" podID="2f752fe8-a008-4737-873f-0ae42990431f" containerID="0b4362bbe23b3f2e2d554dc797b564555b1676795a8f7418c1957f3bec62f4bb" exitCode=143 Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.913918 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2f752fe8-a008-4737-873f-0ae42990431f","Type":"ContainerDied","Data":"0b4362bbe23b3f2e2d554dc797b564555b1676795a8f7418c1957f3bec62f4bb"} Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 
14:22:58.918931 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9614-account-create-update-4784w"] Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.924971 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9614-account-create-update-4784w" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.931764 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.933396 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wntc4\" (UniqueName: \"kubernetes.io/projected/8c62dc13-d167-48da-909e-eff8ca5852f4-kube-api-access-wntc4\") pod \"nova-cell0-db-create-f9qhz\" (UID: \"8c62dc13-d167-48da-909e-eff8ca5852f4\") " pod="openstack/nova-cell0-db-create-f9qhz" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.942171 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-98s5z" Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.942968 4756 generic.go:334] "Generic (PLEG): container finished" podID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerID="2c9cadb470d69e4ba22f022cd3177e4dd2e4e7426da004bcf5d2250205c1d4f1" exitCode=137 Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.943000 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a791a6a-0e50-465a-90cf-e4af5bdc12de","Type":"ContainerDied","Data":"2c9cadb470d69e4ba22f022cd3177e4dd2e4e7426da004bcf5d2250205c1d4f1"} Mar 18 14:22:58 crc kubenswrapper[4756]: I0318 14:22:58.968179 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9614-account-create-update-4784w"] Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.005634 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5aabe4a2-969f-44f7-aa29-60db47845f80-operator-scripts\") pod \"nova-api-ac52-account-create-update-kxhc4\" (UID: \"5aabe4a2-969f-44f7-aa29-60db47845f80\") " pod="openstack/nova-api-ac52-account-create-update-kxhc4" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.005724 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktcj5\" (UniqueName: \"kubernetes.io/projected/5aabe4a2-969f-44f7-aa29-60db47845f80-kube-api-access-ktcj5\") pod \"nova-api-ac52-account-create-update-kxhc4\" (UID: \"5aabe4a2-969f-44f7-aa29-60db47845f80\") " pod="openstack/nova-api-ac52-account-create-update-kxhc4" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.007449 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5aabe4a2-969f-44f7-aa29-60db47845f80-operator-scripts\") pod \"nova-api-ac52-account-create-update-kxhc4\" (UID: \"5aabe4a2-969f-44f7-aa29-60db47845f80\") " pod="openstack/nova-api-ac52-account-create-update-kxhc4" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.018510 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-f7zgs"] Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.019965 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-f7zgs" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.034282 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-f9qhz" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.040662 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktcj5\" (UniqueName: \"kubernetes.io/projected/5aabe4a2-969f-44f7-aa29-60db47845f80-kube-api-access-ktcj5\") pod \"nova-api-ac52-account-create-update-kxhc4\" (UID: \"5aabe4a2-969f-44f7-aa29-60db47845f80\") " pod="openstack/nova-api-ac52-account-create-update-kxhc4" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.059236 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-f7zgs"] Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.108164 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/928231e9-14ab-4f87-85c0-37a372d3ed9d-operator-scripts\") pod \"nova-cell1-db-create-f7zgs\" (UID: \"928231e9-14ab-4f87-85c0-37a372d3ed9d\") " pod="openstack/nova-cell1-db-create-f7zgs" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.108228 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwwjn\" (UniqueName: \"kubernetes.io/projected/928231e9-14ab-4f87-85c0-37a372d3ed9d-kube-api-access-mwwjn\") pod \"nova-cell1-db-create-f7zgs\" (UID: \"928231e9-14ab-4f87-85c0-37a372d3ed9d\") " pod="openstack/nova-cell1-db-create-f7zgs" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.108250 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9hgx\" (UniqueName: \"kubernetes.io/projected/dc4adb0e-915e-4b69-ad6f-2f510f53e2e8-kube-api-access-z9hgx\") pod \"nova-cell0-9614-account-create-update-4784w\" (UID: \"dc4adb0e-915e-4b69-ad6f-2f510f53e2e8\") " pod="openstack/nova-cell0-9614-account-create-update-4784w" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.108307 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc4adb0e-915e-4b69-ad6f-2f510f53e2e8-operator-scripts\") pod \"nova-cell0-9614-account-create-update-4784w\" (UID: \"dc4adb0e-915e-4b69-ad6f-2f510f53e2e8\") " pod="openstack/nova-cell0-9614-account-create-update-4784w" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.113782 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ac52-account-create-update-kxhc4" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.125248 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1f8d-account-create-update-47b8t"] Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.127266 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1f8d-account-create-update-47b8t" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.132235 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.136175 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1f8d-account-create-update-47b8t"] Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.209993 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwwjn\" (UniqueName: \"kubernetes.io/projected/928231e9-14ab-4f87-85c0-37a372d3ed9d-kube-api-access-mwwjn\") pod \"nova-cell1-db-create-f7zgs\" (UID: \"928231e9-14ab-4f87-85c0-37a372d3ed9d\") " pod="openstack/nova-cell1-db-create-f7zgs" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.210034 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9hgx\" (UniqueName: \"kubernetes.io/projected/dc4adb0e-915e-4b69-ad6f-2f510f53e2e8-kube-api-access-z9hgx\") pod 
\"nova-cell0-9614-account-create-update-4784w\" (UID: \"dc4adb0e-915e-4b69-ad6f-2f510f53e2e8\") " pod="openstack/nova-cell0-9614-account-create-update-4784w" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.210079 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc4adb0e-915e-4b69-ad6f-2f510f53e2e8-operator-scripts\") pod \"nova-cell0-9614-account-create-update-4784w\" (UID: \"dc4adb0e-915e-4b69-ad6f-2f510f53e2e8\") " pod="openstack/nova-cell0-9614-account-create-update-4784w" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.210105 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd8dh\" (UniqueName: \"kubernetes.io/projected/3d1ea486-ed86-4ae5-9374-3538e9d1e4fc-kube-api-access-cd8dh\") pod \"nova-cell1-1f8d-account-create-update-47b8t\" (UID: \"3d1ea486-ed86-4ae5-9374-3538e9d1e4fc\") " pod="openstack/nova-cell1-1f8d-account-create-update-47b8t" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.210256 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d1ea486-ed86-4ae5-9374-3538e9d1e4fc-operator-scripts\") pod \"nova-cell1-1f8d-account-create-update-47b8t\" (UID: \"3d1ea486-ed86-4ae5-9374-3538e9d1e4fc\") " pod="openstack/nova-cell1-1f8d-account-create-update-47b8t" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.210276 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/928231e9-14ab-4f87-85c0-37a372d3ed9d-operator-scripts\") pod \"nova-cell1-db-create-f7zgs\" (UID: \"928231e9-14ab-4f87-85c0-37a372d3ed9d\") " pod="openstack/nova-cell1-db-create-f7zgs" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.211264 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc4adb0e-915e-4b69-ad6f-2f510f53e2e8-operator-scripts\") pod \"nova-cell0-9614-account-create-update-4784w\" (UID: \"dc4adb0e-915e-4b69-ad6f-2f510f53e2e8\") " pod="openstack/nova-cell0-9614-account-create-update-4784w" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.211305 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/928231e9-14ab-4f87-85c0-37a372d3ed9d-operator-scripts\") pod \"nova-cell1-db-create-f7zgs\" (UID: \"928231e9-14ab-4f87-85c0-37a372d3ed9d\") " pod="openstack/nova-cell1-db-create-f7zgs" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.227715 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwwjn\" (UniqueName: \"kubernetes.io/projected/928231e9-14ab-4f87-85c0-37a372d3ed9d-kube-api-access-mwwjn\") pod \"nova-cell1-db-create-f7zgs\" (UID: \"928231e9-14ab-4f87-85c0-37a372d3ed9d\") " pod="openstack/nova-cell1-db-create-f7zgs" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.228322 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9hgx\" (UniqueName: \"kubernetes.io/projected/dc4adb0e-915e-4b69-ad6f-2f510f53e2e8-kube-api-access-z9hgx\") pod \"nova-cell0-9614-account-create-update-4784w\" (UID: \"dc4adb0e-915e-4b69-ad6f-2f510f53e2e8\") " pod="openstack/nova-cell0-9614-account-create-update-4784w" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.294203 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9614-account-create-update-4784w" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.314101 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d1ea486-ed86-4ae5-9374-3538e9d1e4fc-operator-scripts\") pod \"nova-cell1-1f8d-account-create-update-47b8t\" (UID: \"3d1ea486-ed86-4ae5-9374-3538e9d1e4fc\") " pod="openstack/nova-cell1-1f8d-account-create-update-47b8t" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.314498 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd8dh\" (UniqueName: \"kubernetes.io/projected/3d1ea486-ed86-4ae5-9374-3538e9d1e4fc-kube-api-access-cd8dh\") pod \"nova-cell1-1f8d-account-create-update-47b8t\" (UID: \"3d1ea486-ed86-4ae5-9374-3538e9d1e4fc\") " pod="openstack/nova-cell1-1f8d-account-create-update-47b8t" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.314709 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d1ea486-ed86-4ae5-9374-3538e9d1e4fc-operator-scripts\") pod \"nova-cell1-1f8d-account-create-update-47b8t\" (UID: \"3d1ea486-ed86-4ae5-9374-3538e9d1e4fc\") " pod="openstack/nova-cell1-1f8d-account-create-update-47b8t" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.335795 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd8dh\" (UniqueName: \"kubernetes.io/projected/3d1ea486-ed86-4ae5-9374-3538e9d1e4fc-kube-api-access-cd8dh\") pod \"nova-cell1-1f8d-account-create-update-47b8t\" (UID: \"3d1ea486-ed86-4ae5-9374-3538e9d1e4fc\") " pod="openstack/nova-cell1-1f8d-account-create-update-47b8t" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.423410 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-f7zgs" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.450419 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1f8d-account-create-update-47b8t" Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.564732 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.564978 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="660c0c11-4c3e-465d-a8e8-181dfda9f400" containerName="glance-log" containerID="cri-o://566c759a70dbd0417353cd55e43283fb0045901dc489f48141cc19b958c7f10a" gracePeriod=30 Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.565073 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="660c0c11-4c3e-465d-a8e8-181dfda9f400" containerName="glance-httpd" containerID="cri-o://b434d48f08622062d240856c8ad009c0feff9187d0d10a7a8919ac644f65115a" gracePeriod=30 Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.957307 4756 generic.go:334] "Generic (PLEG): container finished" podID="660c0c11-4c3e-465d-a8e8-181dfda9f400" containerID="566c759a70dbd0417353cd55e43283fb0045901dc489f48141cc19b958c7f10a" exitCode=143 Mar 18 14:22:59 crc kubenswrapper[4756]: I0318 14:22:59.957541 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"660c0c11-4c3e-465d-a8e8-181dfda9f400","Type":"ContainerDied","Data":"566c759a70dbd0417353cd55e43283fb0045901dc489f48141cc19b958c7f10a"} Mar 18 14:23:00 crc kubenswrapper[4756]: I0318 14:23:00.709639 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="proxy-httpd" probeResult="failure" output="Get 
\"http://10.217.0.168:3000/\": dial tcp 10.217.0.168:3000: connect: connection refused" Mar 18 14:23:01 crc kubenswrapper[4756]: I0318 14:23:01.152891 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-766db9c4f-fbsb4" Mar 18 14:23:01 crc kubenswrapper[4756]: I0318 14:23:01.224518 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56db767868-svvqr"] Mar 18 14:23:01 crc kubenswrapper[4756]: I0318 14:23:01.224773 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56db767868-svvqr" podUID="e7ade8da-71b8-402b-8b7c-d2b333ab31da" containerName="neutron-api" containerID="cri-o://720b57841e6438cde31ac56c82d9321f65025de2d66a3d41576fa3ab95840aaa" gracePeriod=30 Mar 18 14:23:01 crc kubenswrapper[4756]: I0318 14:23:01.225186 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56db767868-svvqr" podUID="e7ade8da-71b8-402b-8b7c-d2b333ab31da" containerName="neutron-httpd" containerID="cri-o://c81c552cccbd366bde29189352ac386f636c21f02ae547e20dd8d50594724edd" gracePeriod=30 Mar 18 14:23:01 crc kubenswrapper[4756]: I0318 14:23:01.980744 4756 generic.go:334] "Generic (PLEG): container finished" podID="e7ade8da-71b8-402b-8b7c-d2b333ab31da" containerID="c81c552cccbd366bde29189352ac386f636c21f02ae547e20dd8d50594724edd" exitCode=0 Mar 18 14:23:01 crc kubenswrapper[4756]: I0318 14:23:01.981481 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56db767868-svvqr" event={"ID":"e7ade8da-71b8-402b-8b7c-d2b333ab31da","Type":"ContainerDied","Data":"c81c552cccbd366bde29189352ac386f636c21f02ae547e20dd8d50594724edd"} Mar 18 14:23:01 crc kubenswrapper[4756]: I0318 14:23:01.983636 4756 generic.go:334] "Generic (PLEG): container finished" podID="2f752fe8-a008-4737-873f-0ae42990431f" containerID="7f8c73a200ab1e9fbc6ec57df49d0fa4daf59882731c743ae02b406a84aa05dd" exitCode=0 Mar 18 14:23:01 crc kubenswrapper[4756]: 
I0318 14:23:01.983715 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2f752fe8-a008-4737-873f-0ae42990431f","Type":"ContainerDied","Data":"7f8c73a200ab1e9fbc6ec57df49d0fa4daf59882731c743ae02b406a84aa05dd"} Mar 18 14:23:01 crc kubenswrapper[4756]: I0318 14:23:01.985975 4756 generic.go:334] "Generic (PLEG): container finished" podID="f3ac7084-50e8-406f-90f5-2cf4d2350935" containerID="aa31d4b7a655bec3dc2cc33e5a9a40a1fb83f5ff25db2e115ddea543f06c9dc7" exitCode=137 Mar 18 14:23:01 crc kubenswrapper[4756]: I0318 14:23:01.985999 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f3ac7084-50e8-406f-90f5-2cf4d2350935","Type":"ContainerDied","Data":"aa31d4b7a655bec3dc2cc33e5a9a40a1fb83f5ff25db2e115ddea543f06c9dc7"} Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.009976 4756 generic.go:334] "Generic (PLEG): container finished" podID="660c0c11-4c3e-465d-a8e8-181dfda9f400" containerID="b434d48f08622062d240856c8ad009c0feff9187d0d10a7a8919ac644f65115a" exitCode=0 Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.010505 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"660c0c11-4c3e-465d-a8e8-181dfda9f400","Type":"ContainerDied","Data":"b434d48f08622062d240856c8ad009c0feff9187d0d10a7a8919ac644f65115a"} Mar 18 14:23:03 crc kubenswrapper[4756]: E0318 14:23:03.150269 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod660c0c11_4c3e_465d_a8e8_181dfda9f400.slice/crio-b434d48f08622062d240856c8ad009c0feff9187d0d10a7a8919ac644f65115a.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod660c0c11_4c3e_465d_a8e8_181dfda9f400.slice/crio-conmon-b434d48f08622062d240856c8ad009c0feff9187d0d10a7a8919ac644f65115a.scope\": RecentStats: unable to find data in memory cache]" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.240317 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.300352 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.411137 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a791a6a-0e50-465a-90cf-e4af5bdc12de-run-httpd\") pod \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.412393 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-scripts\") pod \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.412567 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-combined-ca-bundle\") pod \"f3ac7084-50e8-406f-90f5-2cf4d2350935\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.412636 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-scripts\") pod \"f3ac7084-50e8-406f-90f5-2cf4d2350935\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " Mar 18 14:23:03 crc 
kubenswrapper[4756]: I0318 14:23:03.412751 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f3ac7084-50e8-406f-90f5-2cf4d2350935-etc-machine-id\") pod \"f3ac7084-50e8-406f-90f5-2cf4d2350935\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.412846 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-config-data-custom\") pod \"f3ac7084-50e8-406f-90f5-2cf4d2350935\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.412926 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-sg-core-conf-yaml\") pod \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.413008 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a791a6a-0e50-465a-90cf-e4af5bdc12de-log-httpd\") pod \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.413083 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-config-data\") pod \"f3ac7084-50e8-406f-90f5-2cf4d2350935\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.413189 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3ac7084-50e8-406f-90f5-2cf4d2350935-logs\") pod 
\"f3ac7084-50e8-406f-90f5-2cf4d2350935\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.413304 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-config-data\") pod \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.413378 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdlkz\" (UniqueName: \"kubernetes.io/projected/f3ac7084-50e8-406f-90f5-2cf4d2350935-kube-api-access-pdlkz\") pod \"f3ac7084-50e8-406f-90f5-2cf4d2350935\" (UID: \"f3ac7084-50e8-406f-90f5-2cf4d2350935\") " Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.413478 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-combined-ca-bundle\") pod \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.413583 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql966\" (UniqueName: \"kubernetes.io/projected/4a791a6a-0e50-465a-90cf-e4af5bdc12de-kube-api-access-ql966\") pod \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\" (UID: \"4a791a6a-0e50-465a-90cf-e4af5bdc12de\") " Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.412302 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a791a6a-0e50-465a-90cf-e4af5bdc12de-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4a791a6a-0e50-465a-90cf-e4af5bdc12de" (UID: "4a791a6a-0e50-465a-90cf-e4af5bdc12de"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.414991 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3ac7084-50e8-406f-90f5-2cf4d2350935-logs" (OuterVolumeSpecName: "logs") pod "f3ac7084-50e8-406f-90f5-2cf4d2350935" (UID: "f3ac7084-50e8-406f-90f5-2cf4d2350935"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.415036 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3ac7084-50e8-406f-90f5-2cf4d2350935-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f3ac7084-50e8-406f-90f5-2cf4d2350935" (UID: "f3ac7084-50e8-406f-90f5-2cf4d2350935"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.419751 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a791a6a-0e50-465a-90cf-e4af5bdc12de-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4a791a6a-0e50-465a-90cf-e4af5bdc12de" (UID: "4a791a6a-0e50-465a-90cf-e4af5bdc12de"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.431411 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-scripts" (OuterVolumeSpecName: "scripts") pod "f3ac7084-50e8-406f-90f5-2cf4d2350935" (UID: "f3ac7084-50e8-406f-90f5-2cf4d2350935"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.431574 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f3ac7084-50e8-406f-90f5-2cf4d2350935" (UID: "f3ac7084-50e8-406f-90f5-2cf4d2350935"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.431607 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3ac7084-50e8-406f-90f5-2cf4d2350935-kube-api-access-pdlkz" (OuterVolumeSpecName: "kube-api-access-pdlkz") pod "f3ac7084-50e8-406f-90f5-2cf4d2350935" (UID: "f3ac7084-50e8-406f-90f5-2cf4d2350935"). InnerVolumeSpecName "kube-api-access-pdlkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.431719 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-scripts" (OuterVolumeSpecName: "scripts") pod "4a791a6a-0e50-465a-90cf-e4af5bdc12de" (UID: "4a791a6a-0e50-465a-90cf-e4af5bdc12de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.457806 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a791a6a-0e50-465a-90cf-e4af5bdc12de-kube-api-access-ql966" (OuterVolumeSpecName: "kube-api-access-ql966") pod "4a791a6a-0e50-465a-90cf-e4af5bdc12de" (UID: "4a791a6a-0e50-465a-90cf-e4af5bdc12de"). InnerVolumeSpecName "kube-api-access-ql966". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.487030 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4a791a6a-0e50-465a-90cf-e4af5bdc12de" (UID: "4a791a6a-0e50-465a-90cf-e4af5bdc12de"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.501455 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3ac7084-50e8-406f-90f5-2cf4d2350935" (UID: "f3ac7084-50e8-406f-90f5-2cf4d2350935"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.524076 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.524482 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.524497 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.524505 4756 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f3ac7084-50e8-406f-90f5-2cf4d2350935-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 
14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.524514 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.524544 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.524552 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a791a6a-0e50-465a-90cf-e4af5bdc12de-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.524561 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3ac7084-50e8-406f-90f5-2cf4d2350935-logs\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.524571 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdlkz\" (UniqueName: \"kubernetes.io/projected/f3ac7084-50e8-406f-90f5-2cf4d2350935-kube-api-access-pdlkz\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.524582 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql966\" (UniqueName: \"kubernetes.io/projected/4a791a6a-0e50-465a-90cf-e4af5bdc12de-kube-api-access-ql966\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.524590 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a791a6a-0e50-465a-90cf-e4af5bdc12de-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.537054 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-config-data" (OuterVolumeSpecName: "config-data") pod "f3ac7084-50e8-406f-90f5-2cf4d2350935" (UID: "f3ac7084-50e8-406f-90f5-2cf4d2350935"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.575024 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a791a6a-0e50-465a-90cf-e4af5bdc12de" (UID: "4a791a6a-0e50-465a-90cf-e4af5bdc12de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.626432 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ac7084-50e8-406f-90f5-2cf4d2350935-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.626462 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.637029 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-config-data" (OuterVolumeSpecName: "config-data") pod "4a791a6a-0e50-465a-90cf-e4af5bdc12de" (UID: "4a791a6a-0e50-465a-90cf-e4af5bdc12de"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.728855 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a791a6a-0e50-465a-90cf-e4af5bdc12de-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:03 crc kubenswrapper[4756]: I0318 14:23:03.894231 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.021220 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.024354 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"26157f10-41ab-4c9e-836d-136febf288cd","Type":"ContainerStarted","Data":"7f8f58360da318b86683c2a036829e3ac4173669a506ac366f7a886759ee3a45"} Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.031253 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"660c0c11-4c3e-465d-a8e8-181dfda9f400","Type":"ContainerDied","Data":"19bc1c1c80f57b5ee27ce248eaf7ca8197002ae2e1830e0fd0a1e305aa00966d"} Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.031307 4756 scope.go:117] "RemoveContainer" containerID="b434d48f08622062d240856c8ad009c0feff9187d0d10a7a8919ac644f65115a" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.031273 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.032504 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f752fe8-a008-4737-873f-0ae42990431f-logs\") pod \"2f752fe8-a008-4737-873f-0ae42990431f\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.032553 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-public-tls-certs\") pod \"2f752fe8-a008-4737-873f-0ae42990431f\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.032605 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-combined-ca-bundle\") pod \"2f752fe8-a008-4737-873f-0ae42990431f\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.032637 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-scripts\") pod \"2f752fe8-a008-4737-873f-0ae42990431f\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.032683 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f752fe8-a008-4737-873f-0ae42990431f-httpd-run\") pod \"2f752fe8-a008-4737-873f-0ae42990431f\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.032845 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") pod \"2f752fe8-a008-4737-873f-0ae42990431f\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.032879 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bqct\" (UniqueName: \"kubernetes.io/projected/2f752fe8-a008-4737-873f-0ae42990431f-kube-api-access-8bqct\") pod \"2f752fe8-a008-4737-873f-0ae42990431f\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.033032 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-config-data\") pod \"2f752fe8-a008-4737-873f-0ae42990431f\" (UID: \"2f752fe8-a008-4737-873f-0ae42990431f\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.033822 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f752fe8-a008-4737-873f-0ae42990431f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2f752fe8-a008-4737-873f-0ae42990431f" (UID: "2f752fe8-a008-4737-873f-0ae42990431f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.034100 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f752fe8-a008-4737-873f-0ae42990431f-logs" (OuterVolumeSpecName: "logs") pod "2f752fe8-a008-4737-873f-0ae42990431f" (UID: "2f752fe8-a008-4737-873f-0ae42990431f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.041168 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f752fe8-a008-4737-873f-0ae42990431f-kube-api-access-8bqct" (OuterVolumeSpecName: "kube-api-access-8bqct") pod "2f752fe8-a008-4737-873f-0ae42990431f" (UID: "2f752fe8-a008-4737-873f-0ae42990431f"). InnerVolumeSpecName "kube-api-access-8bqct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.048485 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-scripts" (OuterVolumeSpecName: "scripts") pod "2f752fe8-a008-4737-873f-0ae42990431f" (UID: "2f752fe8-a008-4737-873f-0ae42990431f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.080933 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2f752fe8-a008-4737-873f-0ae42990431f","Type":"ContainerDied","Data":"a957f2c89f4b9b89ccb42396fa6020778c1d86289e6443a7d2175d0f1fbe4911"} Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.081058 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.117297 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.462240208 podStartE2EDuration="15.11727499s" podCreationTimestamp="2026-03-18 14:22:49 +0000 UTC" firstStartedPulling="2026-03-18 14:22:51.349019648 +0000 UTC m=+1372.663437623" lastFinishedPulling="2026-03-18 14:23:03.00405444 +0000 UTC m=+1384.318472405" observedRunningTime="2026-03-18 14:23:04.081432161 +0000 UTC m=+1385.395850136" watchObservedRunningTime="2026-03-18 14:23:04.11727499 +0000 UTC m=+1385.431692965" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.142953 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f752fe8-a008-4737-873f-0ae42990431f" (UID: "2f752fe8-a008-4737-873f-0ae42990431f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.143042 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2f752fe8-a008-4737-873f-0ae42990431f" (UID: "2f752fe8-a008-4737-873f-0ae42990431f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.143180 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817" (OuterVolumeSpecName: "glance") pod "2f752fe8-a008-4737-873f-0ae42990431f" (UID: "2f752fe8-a008-4737-873f-0ae42990431f"). InnerVolumeSpecName "pvc-d44f36dc-e387-43e2-913e-de408349f817". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.143885 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-combined-ca-bundle\") pod \"660c0c11-4c3e-465d-a8e8-181dfda9f400\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.143922 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-config-data\") pod \"660c0c11-4c3e-465d-a8e8-181dfda9f400\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.143950 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/660c0c11-4c3e-465d-a8e8-181dfda9f400-logs\") pod \"660c0c11-4c3e-465d-a8e8-181dfda9f400\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.144009 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-internal-tls-certs\") pod \"660c0c11-4c3e-465d-a8e8-181dfda9f400\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.144053 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdz87\" (UniqueName: \"kubernetes.io/projected/660c0c11-4c3e-465d-a8e8-181dfda9f400-kube-api-access-qdz87\") pod \"660c0c11-4c3e-465d-a8e8-181dfda9f400\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.144148 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-scripts\") pod \"660c0c11-4c3e-465d-a8e8-181dfda9f400\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.144186 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/660c0c11-4c3e-465d-a8e8-181dfda9f400-httpd-run\") pod \"660c0c11-4c3e-465d-a8e8-181dfda9f400\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.144344 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") pod \"660c0c11-4c3e-465d-a8e8-181dfda9f400\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.144797 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f752fe8-a008-4737-873f-0ae42990431f-logs\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.144807 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.144818 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.144826 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.144834 4756 
reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f752fe8-a008-4737-873f-0ae42990431f-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.144866 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") on node \"crc\" " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.144876 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bqct\" (UniqueName: \"kubernetes.io/projected/2f752fe8-a008-4737-873f-0ae42990431f-kube-api-access-8bqct\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.146875 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/660c0c11-4c3e-465d-a8e8-181dfda9f400-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "660c0c11-4c3e-465d-a8e8-181dfda9f400" (UID: "660c0c11-4c3e-465d-a8e8-181dfda9f400"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.148444 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/660c0c11-4c3e-465d-a8e8-181dfda9f400-logs" (OuterVolumeSpecName: "logs") pod "660c0c11-4c3e-465d-a8e8-181dfda9f400" (UID: "660c0c11-4c3e-465d-a8e8-181dfda9f400"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.152293 4756 scope.go:117] "RemoveContainer" containerID="566c759a70dbd0417353cd55e43283fb0045901dc489f48141cc19b958c7f10a" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.156486 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a791a6a-0e50-465a-90cf-e4af5bdc12de","Type":"ContainerDied","Data":"b881e246a0058c4d718cf673d769f0b3b79a465134dd30fe87e792a89dc748ef"} Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.156500 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.156637 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/660c0c11-4c3e-465d-a8e8-181dfda9f400-kube-api-access-qdz87" (OuterVolumeSpecName: "kube-api-access-qdz87") pod "660c0c11-4c3e-465d-a8e8-181dfda9f400" (UID: "660c0c11-4c3e-465d-a8e8-181dfda9f400"). InnerVolumeSpecName "kube-api-access-qdz87". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.157600 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-config-data" (OuterVolumeSpecName: "config-data") pod "2f752fe8-a008-4737-873f-0ae42990431f" (UID: "2f752fe8-a008-4737-873f-0ae42990431f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.157750 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-scripts" (OuterVolumeSpecName: "scripts") pod "660c0c11-4c3e-465d-a8e8-181dfda9f400" (UID: "660c0c11-4c3e-465d-a8e8-181dfda9f400"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.168822 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b" (OuterVolumeSpecName: "glance") pod "660c0c11-4c3e-465d-a8e8-181dfda9f400" (UID: "660c0c11-4c3e-465d-a8e8-181dfda9f400"). InnerVolumeSpecName "pvc-565e4be1-2f3b-410a-808c-677fff515f0b". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.177264 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.177550 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f3ac7084-50e8-406f-90f5-2cf4d2350935","Type":"ContainerDied","Data":"2b2d90cb48883ef2efa049ac425da655351111d655aecf504e0729f05013936f"} Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.184714 4756 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.184877 4756 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d44f36dc-e387-43e2-913e-de408349f817" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817") on node "crc" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.199785 4756 scope.go:117] "RemoveContainer" containerID="7f8c73a200ab1e9fbc6ec57df49d0fa4daf59882731c743ae02b406a84aa05dd" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.217450 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "660c0c11-4c3e-465d-a8e8-181dfda9f400" (UID: "660c0c11-4c3e-465d-a8e8-181dfda9f400"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.225853 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.249775 4756 scope.go:117] "RemoveContainer" containerID="0b4362bbe23b3f2e2d554dc797b564555b1676795a8f7418c1957f3bec62f4bb" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.263148 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "660c0c11-4c3e-465d-a8e8-181dfda9f400" (UID: "660c0c11-4c3e-465d-a8e8-181dfda9f400"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.263400 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-internal-tls-certs\") pod \"660c0c11-4c3e-465d-a8e8-181dfda9f400\" (UID: \"660c0c11-4c3e-465d-a8e8-181dfda9f400\") " Mar 18 14:23:04 crc kubenswrapper[4756]: W0318 14:23:04.264203 4756 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/660c0c11-4c3e-465d-a8e8-181dfda9f400/volumes/kubernetes.io~secret/internal-tls-certs Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.264291 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") on node \"crc\" " Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.264316 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.264328 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/660c0c11-4c3e-465d-a8e8-181dfda9f400-logs\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.264340 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f752fe8-a008-4737-873f-0ae42990431f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.264348 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdz87\" (UniqueName: 
\"kubernetes.io/projected/660c0c11-4c3e-465d-a8e8-181dfda9f400-kube-api-access-qdz87\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.264357 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.264367 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/660c0c11-4c3e-465d-a8e8-181dfda9f400-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.264376 4756 reconciler_common.go:293] "Volume detached for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.264295 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "660c0c11-4c3e-465d-a8e8-181dfda9f400" (UID: "660c0c11-4c3e-465d-a8e8-181dfda9f400"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.269409 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.285732 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:04 crc kubenswrapper[4756]: E0318 14:23:04.286183 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f752fe8-a008-4737-873f-0ae42990431f" containerName="glance-log" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286197 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f752fe8-a008-4737-873f-0ae42990431f" containerName="glance-log" Mar 18 14:23:04 crc kubenswrapper[4756]: E0318 14:23:04.286223 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="ceilometer-central-agent" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286230 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="ceilometer-central-agent" Mar 18 14:23:04 crc kubenswrapper[4756]: E0318 14:23:04.286241 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="proxy-httpd" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286247 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="proxy-httpd" Mar 18 14:23:04 crc kubenswrapper[4756]: E0318 14:23:04.286263 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="660c0c11-4c3e-465d-a8e8-181dfda9f400" containerName="glance-log" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286268 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="660c0c11-4c3e-465d-a8e8-181dfda9f400" containerName="glance-log" Mar 18 14:23:04 crc kubenswrapper[4756]: E0318 14:23:04.286279 4756 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ac7084-50e8-406f-90f5-2cf4d2350935" containerName="cinder-api" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286284 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ac7084-50e8-406f-90f5-2cf4d2350935" containerName="cinder-api" Mar 18 14:23:04 crc kubenswrapper[4756]: E0318 14:23:04.286298 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f752fe8-a008-4737-873f-0ae42990431f" containerName="glance-httpd" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286303 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f752fe8-a008-4737-873f-0ae42990431f" containerName="glance-httpd" Mar 18 14:23:04 crc kubenswrapper[4756]: E0318 14:23:04.286313 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="sg-core" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286319 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="sg-core" Mar 18 14:23:04 crc kubenswrapper[4756]: E0318 14:23:04.286328 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ac7084-50e8-406f-90f5-2cf4d2350935" containerName="cinder-api-log" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286334 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ac7084-50e8-406f-90f5-2cf4d2350935" containerName="cinder-api-log" Mar 18 14:23:04 crc kubenswrapper[4756]: E0318 14:23:04.286348 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="660c0c11-4c3e-465d-a8e8-181dfda9f400" containerName="glance-httpd" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286354 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="660c0c11-4c3e-465d-a8e8-181dfda9f400" containerName="glance-httpd" Mar 18 14:23:04 crc kubenswrapper[4756]: E0318 14:23:04.286368 4756 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="ceilometer-notification-agent"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286374 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="ceilometer-notification-agent"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286542 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="proxy-httpd"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286554 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="sg-core"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286562 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="660c0c11-4c3e-465d-a8e8-181dfda9f400" containerName="glance-log"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286576 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ac7084-50e8-406f-90f5-2cf4d2350935" containerName="cinder-api-log"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286586 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="660c0c11-4c3e-465d-a8e8-181dfda9f400" containerName="glance-httpd"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286602 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="ceilometer-central-agent"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286612 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f752fe8-a008-4737-873f-0ae42990431f" containerName="glance-log"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286619 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ac7084-50e8-406f-90f5-2cf4d2350935" containerName="cinder-api"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286638 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f752fe8-a008-4737-873f-0ae42990431f" containerName="glance-httpd"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.286644 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" containerName="ceilometer-notification-agent"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.288379 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.291804 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.325621 4756 scope.go:117] "RemoveContainer" containerID="2c9cadb470d69e4ba22f022cd3177e4dd2e4e7426da004bcf5d2250205c1d4f1"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.330826 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.332499 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-config-data" (OuterVolumeSpecName: "config-data") pod "660c0c11-4c3e-465d-a8e8-181dfda9f400" (UID: "660c0c11-4c3e-465d-a8e8-181dfda9f400"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.349865 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.370227 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-scripts\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.370352 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.370487 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.370526 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-config-data\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.370629 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfcdc\" (UniqueName: \"kubernetes.io/projected/3a5e6ffc-e769-4460-996d-134e5eed2153-kube-api-access-xfcdc\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.370750 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5e6ffc-e769-4460-996d-134e5eed2153-run-httpd\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.370775 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5e6ffc-e769-4460-996d-134e5eed2153-log-httpd\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.370939 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.370957 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/660c0c11-4c3e-465d-a8e8-181dfda9f400-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.371072 4756 scope.go:117] "RemoveContainer" containerID="544736d183a3cbf62f7ad025b7e67c4b647a9fe7b073a0f3ad059d28034fcae6"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.381189 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.390648 4756 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.390794 4756 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-565e4be1-2f3b-410a-808c-677fff515f0b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b") on node "crc"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.397530 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.406281 4756 scope.go:117] "RemoveContainer" containerID="7a34dbea7ba5ecf140e7fb7f99455f2180277a988725a32294c2b9450d52ab50"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.417318 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 18 14:23:04 crc kubenswrapper[4756]: W0318 14:23:04.417687 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1190e7a3_cde5_467e_adbc_b20c6f7823d5.slice/crio-5287b781f97b68b45eb633d93b2a9a0ecf0cbfd9bed4b0ee2a8930796bf915f7 WatchSource:0}: Error finding container 5287b781f97b68b45eb633d93b2a9a0ecf0cbfd9bed4b0ee2a8930796bf915f7: Status 404 returned error can't find the container with id 5287b781f97b68b45eb633d93b2a9a0ecf0cbfd9bed4b0ee2a8930796bf915f7
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.419432 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.421693 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.421938 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.422055 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.439357 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.445382 4756 scope.go:117] "RemoveContainer" containerID="6cd2f7cec824393628fb88948e4665aad058ef37c40b05f1165872d6909f1e3a"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.447105 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.450951 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.462446 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-98s5z"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.472936 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5e6ffc-e769-4460-996d-134e5eed2153-run-httpd\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.472980 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5e6ffc-e769-4460-996d-134e5eed2153-log-httpd\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.473002 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.473075 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-scripts\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.473102 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-config-data-custom\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.473137 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/506de922-637d-4174-aeaa-236a27140466-etc-machine-id\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.473155 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rd48\" (UniqueName: \"kubernetes.io/projected/506de922-637d-4174-aeaa-236a27140466-kube-api-access-5rd48\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.473172 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.473212 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.473249 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-scripts\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.473273 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.473298 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-config-data\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.473362 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfcdc\" (UniqueName: \"kubernetes.io/projected/3a5e6ffc-e769-4460-996d-134e5eed2153-kube-api-access-xfcdc\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.473387 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-public-tls-certs\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.473405 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-config-data\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.473426 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506de922-637d-4174-aeaa-236a27140466-logs\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.473495 4756 reconciler_common.go:293] "Volume detached for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") on node \"crc\" DevicePath \"\""
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.476768 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5e6ffc-e769-4460-996d-134e5eed2153-run-httpd\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.476976 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5e6ffc-e769-4460-996d-134e5eed2153-log-httpd\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.481082 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9614-account-create-update-4784w"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.497629 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.499008 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ac52-account-create-update-kxhc4"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.516901 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-scripts\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.522568 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfcdc\" (UniqueName: \"kubernetes.io/projected/3a5e6ffc-e769-4460-996d-134e5eed2153-kube-api-access-xfcdc\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.537733 4756 scope.go:117] "RemoveContainer" containerID="aa31d4b7a655bec3dc2cc33e5a9a40a1fb83f5ff25db2e115ddea543f06c9dc7"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.554864 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.561985 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.565297 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-config-data\") pod \"ceilometer-0\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.573975 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.575044 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7454cc5499-pkq5t"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.575891 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.575936 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-scripts\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.576032 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-public-tls-certs\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.576060 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-config-data\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.576088 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506de922-637d-4174-aeaa-236a27140466-logs\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.576153 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.576245 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7454cc5499-pkq5t"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.576279 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-config-data-custom\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.576305 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/506de922-637d-4174-aeaa-236a27140466-etc-machine-id\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.576328 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rd48\" (UniqueName: \"kubernetes.io/projected/506de922-637d-4174-aeaa-236a27140466-kube-api-access-5rd48\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.577333 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506de922-637d-4174-aeaa-236a27140466-logs\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.578921 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/506de922-637d-4174-aeaa-236a27140466-etc-machine-id\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.580690 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-public-tls-certs\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.588974 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.593925 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rd48\" (UniqueName: \"kubernetes.io/projected/506de922-637d-4174-aeaa-236a27140466-kube-api-access-5rd48\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.599075 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.599191 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-scripts\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.599216 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.599520 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-config-data-custom\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.604181 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.606203 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.609835 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506de922-637d-4174-aeaa-236a27140466-config-data\") pod \"cinder-api-0\" (UID: \"506de922-637d-4174-aeaa-236a27140466\") " pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.613803 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.614678 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.614895 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.615085 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rwf89"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.615374 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.618631 4756 scope.go:117] "RemoveContainer" containerID="a7f0050fbff4930b6df477526c56e2770bd369a97e76593023a51fda2f706ae7"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.619463 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-f7zgs"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.629329 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.637423 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.647895 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.659072 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1f8d-account-create-update-47b8t"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.682221 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d168f1b-0779-47a8-8346-254ecfc9a126-logs\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.682285 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d168f1b-0779-47a8-8346-254ecfc9a126-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.682373 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.682399 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94wwf\" (UniqueName: \"kubernetes.io/projected/2d168f1b-0779-47a8-8346-254ecfc9a126-kube-api-access-94wwf\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.682461 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d168f1b-0779-47a8-8346-254ecfc9a126-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.682550 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d168f1b-0779-47a8-8346-254ecfc9a126-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.682591 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d168f1b-0779-47a8-8346-254ecfc9a126-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.682642 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d168f1b-0779-47a8-8346-254ecfc9a126-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.684831 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.686777 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.689270 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.689404 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.698895 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.708233 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f9qhz"]
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.759080 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791035 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d168f1b-0779-47a8-8346-254ecfc9a126-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791102 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791171 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90742574-07ba-4265-aa05-59c9f557caf0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791208 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d168f1b-0779-47a8-8346-254ecfc9a126-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791241 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90742574-07ba-4265-aa05-59c9f557caf0-logs\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791259 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d168f1b-0779-47a8-8346-254ecfc9a126-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791287 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d168f1b-0779-47a8-8346-254ecfc9a126-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791353 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d168f1b-0779-47a8-8346-254ecfc9a126-logs\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791373 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90742574-07ba-4265-aa05-59c9f557caf0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791409 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d168f1b-0779-47a8-8346-254ecfc9a126-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791475 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90742574-07ba-4265-aa05-59c9f557caf0-config-data\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791497 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791522 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94wwf\" (UniqueName: \"kubernetes.io/projected/2d168f1b-0779-47a8-8346-254ecfc9a126-kube-api-access-94wwf\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791553 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90742574-07ba-4265-aa05-59c9f557caf0-scripts\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791576 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmjqn\" (UniqueName: \"kubernetes.io/projected/90742574-07ba-4265-aa05-59c9f557caf0-kube-api-access-cmjqn\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.791593 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90742574-07ba-4265-aa05-59c9f557caf0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.792437 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d168f1b-0779-47a8-8346-254ecfc9a126-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.795657 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName:
\"kubernetes.io/empty-dir/2d168f1b-0779-47a8-8346-254ecfc9a126-logs\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.801521 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d168f1b-0779-47a8-8346-254ecfc9a126-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.805596 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d168f1b-0779-47a8-8346-254ecfc9a126-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.807032 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d168f1b-0779-47a8-8346-254ecfc9a126-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0" Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.809590 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.809631 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/14728f060a3b5046b333048907b438ee0376fa68800afec942964a27fea1d4a8/globalmount\"" pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.831278 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d168f1b-0779-47a8-8346-254ecfc9a126-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.846804 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94wwf\" (UniqueName: \"kubernetes.io/projected/2d168f1b-0779-47a8-8346-254ecfc9a126-kube-api-access-94wwf\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.895619 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90742574-07ba-4265-aa05-59c9f557caf0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.895684 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.895732 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90742574-07ba-4265-aa05-59c9f557caf0-logs\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.895854 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90742574-07ba-4265-aa05-59c9f557caf0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.895930 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90742574-07ba-4265-aa05-59c9f557caf0-config-data\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.895956 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90742574-07ba-4265-aa05-59c9f557caf0-scripts\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.895978 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmjqn\" (UniqueName: \"kubernetes.io/projected/90742574-07ba-4265-aa05-59c9f557caf0-kube-api-access-cmjqn\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.895993 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90742574-07ba-4265-aa05-59c9f557caf0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.897815 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90742574-07ba-4265-aa05-59c9f557caf0-logs\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.915109 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90742574-07ba-4265-aa05-59c9f557caf0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.917689 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90742574-07ba-4265-aa05-59c9f557caf0-scripts\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.919292 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90742574-07ba-4265-aa05-59c9f557caf0-config-data\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.919854 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90742574-07ba-4265-aa05-59c9f557caf0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.928609 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90742574-07ba-4265-aa05-59c9f557caf0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.968963 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmjqn\" (UniqueName: \"kubernetes.io/projected/90742574-07ba-4265-aa05-59c9f557caf0-kube-api-access-cmjqn\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.969241 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 14:23:04 crc kubenswrapper[4756]: I0318 14:23:04.969264 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2d16b602ee6d280a584346f693cabe12df6f04b4d4e7d81a050a495635db21be/globalmount\"" pod="openstack/glance-default-external-api-0"
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.126949 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-565e4be1-2f3b-410a-808c-677fff515f0b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-565e4be1-2f3b-410a-808c-677fff515f0b\") pod \"glance-default-internal-api-0\" (UID: \"2d168f1b-0779-47a8-8346-254ecfc9a126\") " pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.208182 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d44f36dc-e387-43e2-913e-de408349f817\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d44f36dc-e387-43e2-913e-de408349f817\") pod \"glance-default-external-api-0\" (UID: \"90742574-07ba-4265-aa05-59c9f557caf0\") " pod="openstack/glance-default-external-api-0"
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.215855 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f7zgs" event={"ID":"928231e9-14ab-4f87-85c0-37a372d3ed9d","Type":"ContainerStarted","Data":"8610ed7fc550c2cbe3412cc69e011eb7d22591a91aefc146acdedf96606d0862"}
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.246159 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1f8d-account-create-update-47b8t" event={"ID":"3d1ea486-ed86-4ae5-9374-3538e9d1e4fc","Type":"ContainerStarted","Data":"ff2b2825a6904d0093a8d67f90b1d45c16f99134193d70b74fb306daafc51698"}
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.254968 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.275969 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ac52-account-create-update-kxhc4" event={"ID":"5aabe4a2-969f-44f7-aa29-60db47845f80","Type":"ContainerStarted","Data":"8b7eca6876b9a4370e43708b5243c2e4485c683063d1ee981fc21553c5f91767"}
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.285845 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f9qhz" event={"ID":"8c62dc13-d167-48da-909e-eff8ca5852f4","Type":"ContainerStarted","Data":"20c153eea46efab0c8523510faa63d8fdcfba982141dde7debf3e0682736ca1a"}
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.297032 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-98s5z" event={"ID":"1190e7a3-cde5-467e-adbc-b20c6f7823d5","Type":"ContainerStarted","Data":"ff3106128bda59043bbdbf5e0778ab558813dd81e0e44b3f12da9fb2361dea8d"}
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.297068 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-98s5z" event={"ID":"1190e7a3-cde5-467e-adbc-b20c6f7823d5","Type":"ContainerStarted","Data":"5287b781f97b68b45eb633d93b2a9a0ecf0cbfd9bed4b0ee2a8930796bf915f7"}
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.312445 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9614-account-create-update-4784w" event={"ID":"dc4adb0e-915e-4b69-ad6f-2f510f53e2e8","Type":"ContainerStarted","Data":"bc4f1ae5d17e659d00cb9fd4ffefe06d21b9bacbadef727291c2f45bea1fcdc7"}
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.313223 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.336383 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-98s5z" podStartSLOduration=7.336367793 podStartE2EDuration="7.336367793s" podCreationTimestamp="2026-03-18 14:22:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:23:05.320138755 +0000 UTC m=+1386.634556730" watchObservedRunningTime="2026-03-18 14:23:05.336367793 +0000 UTC m=+1386.650785768"
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.378981 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f752fe8-a008-4737-873f-0ae42990431f" path="/var/lib/kubelet/pods/2f752fe8-a008-4737-873f-0ae42990431f/volumes"
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.383446 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a791a6a-0e50-465a-90cf-e4af5bdc12de" path="/var/lib/kubelet/pods/4a791a6a-0e50-465a-90cf-e4af5bdc12de/volumes"
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.388304 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="660c0c11-4c3e-465d-a8e8-181dfda9f400" path="/var/lib/kubelet/pods/660c0c11-4c3e-465d-a8e8-181dfda9f400/volumes"
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.388990 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3ac7084-50e8-406f-90f5-2cf4d2350935" path="/var/lib/kubelet/pods/f3ac7084-50e8-406f-90f5-2cf4d2350935/volumes"
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.689035 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 14:23:05 crc kubenswrapper[4756]: I0318 14:23:05.879183 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.081151 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.164111 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.391700 4756 generic.go:334] "Generic (PLEG): container finished" podID="dc4adb0e-915e-4b69-ad6f-2f510f53e2e8" containerID="17e85402eccc18bc2e601485967e35c113bbd4370f6b7df22124dd9024330ee9" exitCode=0
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.391762 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9614-account-create-update-4784w" event={"ID":"dc4adb0e-915e-4b69-ad6f-2f510f53e2e8","Type":"ContainerDied","Data":"17e85402eccc18bc2e601485967e35c113bbd4370f6b7df22124dd9024330ee9"}
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.394946 4756 generic.go:334] "Generic (PLEG): container finished" podID="928231e9-14ab-4f87-85c0-37a372d3ed9d" containerID="d38503af1d576fa6cca741f50a2e53337bdf196bf6690d4b3414bc8615900fd5" exitCode=0
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.395022 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f7zgs" event={"ID":"928231e9-14ab-4f87-85c0-37a372d3ed9d","Type":"ContainerDied","Data":"d38503af1d576fa6cca741f50a2e53337bdf196bf6690d4b3414bc8615900fd5"}
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.402442 4756 generic.go:334] "Generic (PLEG): container finished" podID="3d1ea486-ed86-4ae5-9374-3538e9d1e4fc" containerID="02c4177cfb2feb7265bb8b4f7d30aea3746c3361258770026ad9235036f215ac" exitCode=0
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.402504 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1f8d-account-create-update-47b8t" event={"ID":"3d1ea486-ed86-4ae5-9374-3538e9d1e4fc","Type":"ContainerDied","Data":"02c4177cfb2feb7265bb8b4f7d30aea3746c3361258770026ad9235036f215ac"}
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.414023 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"506de922-637d-4174-aeaa-236a27140466","Type":"ContainerStarted","Data":"33a5ced135a86b67ca6bf8b24a422cb89315e76a133c270f1c8756975e920420"}
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.421141 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"90742574-07ba-4265-aa05-59c9f557caf0","Type":"ContainerStarted","Data":"4c0e010d7dd117b0668951e3716c0ea82bdfb97679762f2faa4ba9de4da59567"}
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.426187 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f9qhz" event={"ID":"8c62dc13-d167-48da-909e-eff8ca5852f4","Type":"ContainerDied","Data":"7e98aefca504c4b58fa01acbc93d1b2ff2866bf22a5a2b4b1e64c5cd365a6f5f"}
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.426108 4756 generic.go:334] "Generic (PLEG): container finished" podID="8c62dc13-d167-48da-909e-eff8ca5852f4" containerID="7e98aefca504c4b58fa01acbc93d1b2ff2866bf22a5a2b4b1e64c5cd365a6f5f" exitCode=0
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.429955 4756 generic.go:334] "Generic (PLEG): container finished" podID="1190e7a3-cde5-467e-adbc-b20c6f7823d5" containerID="ff3106128bda59043bbdbf5e0778ab558813dd81e0e44b3f12da9fb2361dea8d" exitCode=0
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.430027 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-98s5z" event={"ID":"1190e7a3-cde5-467e-adbc-b20c6f7823d5","Type":"ContainerDied","Data":"ff3106128bda59043bbdbf5e0778ab558813dd81e0e44b3f12da9fb2361dea8d"}
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.437107 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5e6ffc-e769-4460-996d-134e5eed2153","Type":"ContainerStarted","Data":"5087fb1586f86a0f2037d33cb3cb00af939b836477a9d451be08920bef8ff937"}
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.441179 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d168f1b-0779-47a8-8346-254ecfc9a126","Type":"ContainerStarted","Data":"c7715917cc41db69ec3e483823fb5f05a3e69be705a69b6d06feecd383d7ecf5"}
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.444238 4756 generic.go:334] "Generic (PLEG): container finished" podID="5aabe4a2-969f-44f7-aa29-60db47845f80" containerID="1b46a550f07d89c90c84d54479874f5949ef2f0e0cb0872c60f476d80993bef2" exitCode=0
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.444299 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ac52-account-create-update-kxhc4" event={"ID":"5aabe4a2-969f-44f7-aa29-60db47845f80","Type":"ContainerDied","Data":"1b46a550f07d89c90c84d54479874f5949ef2f0e0cb0872c60f476d80993bef2"}
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.474702 4756 generic.go:334] "Generic (PLEG): container finished" podID="e7ade8da-71b8-402b-8b7c-d2b333ab31da" containerID="720b57841e6438cde31ac56c82d9321f65025de2d66a3d41576fa3ab95840aaa" exitCode=0
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.474757 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56db767868-svvqr" event={"ID":"e7ade8da-71b8-402b-8b7c-d2b333ab31da","Type":"ContainerDied","Data":"720b57841e6438cde31ac56c82d9321f65025de2d66a3d41576fa3ab95840aaa"}
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.695973 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56db767868-svvqr"
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.862258 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjflb\" (UniqueName: \"kubernetes.io/projected/e7ade8da-71b8-402b-8b7c-d2b333ab31da-kube-api-access-kjflb\") pod \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") "
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.862364 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-httpd-config\") pod \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") "
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.862467 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-config\") pod \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") "
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.862503 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-combined-ca-bundle\") pod \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") "
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.862548 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-ovndb-tls-certs\") pod \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\" (UID: \"e7ade8da-71b8-402b-8b7c-d2b333ab31da\") "
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.873027 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e7ade8da-71b8-402b-8b7c-d2b333ab31da" (UID: "e7ade8da-71b8-402b-8b7c-d2b333ab31da"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.873088 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7ade8da-71b8-402b-8b7c-d2b333ab31da-kube-api-access-kjflb" (OuterVolumeSpecName: "kube-api-access-kjflb") pod "e7ade8da-71b8-402b-8b7c-d2b333ab31da" (UID: "e7ade8da-71b8-402b-8b7c-d2b333ab31da"). InnerVolumeSpecName "kube-api-access-kjflb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.926695 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-config" (OuterVolumeSpecName: "config") pod "e7ade8da-71b8-402b-8b7c-d2b333ab31da" (UID: "e7ade8da-71b8-402b-8b7c-d2b333ab31da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.959586 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e7ade8da-71b8-402b-8b7c-d2b333ab31da" (UID: "e7ade8da-71b8-402b-8b7c-d2b333ab31da"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.964570 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjflb\" (UniqueName: \"kubernetes.io/projected/e7ade8da-71b8-402b-8b7c-d2b333ab31da-kube-api-access-kjflb\") on node \"crc\" DevicePath \"\""
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.964593 4756 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.964602 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-config\") on node \"crc\" DevicePath \"\""
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.964611 4756 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 14:23:06 crc kubenswrapper[4756]: I0318 14:23:06.966131 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7ade8da-71b8-402b-8b7c-d2b333ab31da" (UID: "e7ade8da-71b8-402b-8b7c-d2b333ab31da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:23:07 crc kubenswrapper[4756]: I0318 14:23:07.038311 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 14:23:07 crc kubenswrapper[4756]: I0318 14:23:07.066951 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ade8da-71b8-402b-8b7c-d2b333ab31da-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 14:23:07 crc kubenswrapper[4756]: I0318 14:23:07.510194 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56db767868-svvqr" event={"ID":"e7ade8da-71b8-402b-8b7c-d2b333ab31da","Type":"ContainerDied","Data":"97c4db9b33d5d8ccd826f7cf87ccc52bc21efb116f44c0aa1b1d1d59f7c090cf"}
Mar 18 14:23:07 crc kubenswrapper[4756]: I0318 14:23:07.510245 4756 scope.go:117] "RemoveContainer" containerID="c81c552cccbd366bde29189352ac386f636c21f02ae547e20dd8d50594724edd"
Mar 18 14:23:07 crc kubenswrapper[4756]: I0318 14:23:07.511758 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56db767868-svvqr"
Mar 18 14:23:07 crc kubenswrapper[4756]: I0318 14:23:07.553256 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56db767868-svvqr"]
Mar 18 14:23:07 crc kubenswrapper[4756]: I0318 14:23:07.565053 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-56db767868-svvqr"]
Mar 18 14:23:07 crc kubenswrapper[4756]: I0318 14:23:07.580185 4756 scope.go:117] "RemoveContainer" containerID="720b57841e6438cde31ac56c82d9321f65025de2d66a3d41576fa3ab95840aaa"
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.077611 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1f8d-account-create-update-47b8t"
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.222147 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd8dh\" (UniqueName: \"kubernetes.io/projected/3d1ea486-ed86-4ae5-9374-3538e9d1e4fc-kube-api-access-cd8dh\") pod \"3d1ea486-ed86-4ae5-9374-3538e9d1e4fc\" (UID: \"3d1ea486-ed86-4ae5-9374-3538e9d1e4fc\") "
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.225145 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d1ea486-ed86-4ae5-9374-3538e9d1e4fc-operator-scripts\") pod \"3d1ea486-ed86-4ae5-9374-3538e9d1e4fc\" (UID: \"3d1ea486-ed86-4ae5-9374-3538e9d1e4fc\") "
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.226910 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d1ea486-ed86-4ae5-9374-3538e9d1e4fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d1ea486-ed86-4ae5-9374-3538e9d1e4fc" (UID: "3d1ea486-ed86-4ae5-9374-3538e9d1e4fc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.232783 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1ea486-ed86-4ae5-9374-3538e9d1e4fc-kube-api-access-cd8dh" (OuterVolumeSpecName: "kube-api-access-cd8dh") pod "3d1ea486-ed86-4ae5-9374-3538e9d1e4fc" (UID: "3d1ea486-ed86-4ae5-9374-3538e9d1e4fc"). InnerVolumeSpecName "kube-api-access-cd8dh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.330565 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d1ea486-ed86-4ae5-9374-3538e9d1e4fc-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.332332 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd8dh\" (UniqueName: \"kubernetes.io/projected/3d1ea486-ed86-4ae5-9374-3538e9d1e4fc-kube-api-access-cd8dh\") on node \"crc\" DevicePath \"\""
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.457137 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-98s5z"
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.479905 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ac52-account-create-update-kxhc4"
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.497933 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-f7zgs"
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.517770 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9614-account-create-update-4784w"
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.527228 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f9qhz"
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.548213 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"90742574-07ba-4265-aa05-59c9f557caf0","Type":"ContainerStarted","Data":"e76692e4fb079ed793cbecf9687345cac0535eb811e66bbf795c5752ead5f0e5"}
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.553910 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d168f1b-0779-47a8-8346-254ecfc9a126","Type":"ContainerStarted","Data":"586a9b918b196ffa2f54dbf2193ac5ec135354ec2c57c3725f0174d7c5824b7e"}
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.558477 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-98s5z" event={"ID":"1190e7a3-cde5-467e-adbc-b20c6f7823d5","Type":"ContainerDied","Data":"5287b781f97b68b45eb633d93b2a9a0ecf0cbfd9bed4b0ee2a8930796bf915f7"}
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.558520 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5287b781f97b68b45eb633d93b2a9a0ecf0cbfd9bed4b0ee2a8930796bf915f7"
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.558583 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-98s5z"
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.561264 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5e6ffc-e769-4460-996d-134e5eed2153","Type":"ContainerStarted","Data":"d16ba12a9c73df48af38ac83cd1d37e3f4f175c56831674025bd9fb8bcea932e"}
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.585400 4756 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-9614-account-create-update-4784w" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.585367 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9614-account-create-update-4784w" event={"ID":"dc4adb0e-915e-4b69-ad6f-2f510f53e2e8","Type":"ContainerDied","Data":"bc4f1ae5d17e659d00cb9fd4ffefe06d21b9bacbadef727291c2f45bea1fcdc7"} Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.590194 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc4f1ae5d17e659d00cb9fd4ffefe06d21b9bacbadef727291c2f45bea1fcdc7" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.596435 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"506de922-637d-4174-aeaa-236a27140466","Type":"ContainerStarted","Data":"bc2335fbfd4b84006e4bbab5ec73ab6c77611de4677a91a7db44b87ecd92ba99"} Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.602301 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ac52-account-create-update-kxhc4" event={"ID":"5aabe4a2-969f-44f7-aa29-60db47845f80","Type":"ContainerDied","Data":"8b7eca6876b9a4370e43708b5243c2e4485c683063d1ee981fc21553c5f91767"} Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.602571 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b7eca6876b9a4370e43708b5243c2e4485c683063d1ee981fc21553c5f91767" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.602523 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ac52-account-create-update-kxhc4" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.611987 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f7zgs" event={"ID":"928231e9-14ab-4f87-85c0-37a372d3ed9d","Type":"ContainerDied","Data":"8610ed7fc550c2cbe3412cc69e011eb7d22591a91aefc146acdedf96606d0862"} Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.612243 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8610ed7fc550c2cbe3412cc69e011eb7d22591a91aefc146acdedf96606d0862" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.612413 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-f7zgs" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.615232 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1f8d-account-create-update-47b8t" event={"ID":"3d1ea486-ed86-4ae5-9374-3538e9d1e4fc","Type":"ContainerDied","Data":"ff2b2825a6904d0093a8d67f90b1d45c16f99134193d70b74fb306daafc51698"} Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.615362 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff2b2825a6904d0093a8d67f90b1d45c16f99134193d70b74fb306daafc51698" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.615522 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1f8d-account-create-update-47b8t" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.620881 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f9qhz" event={"ID":"8c62dc13-d167-48da-909e-eff8ca5852f4","Type":"ContainerDied","Data":"20c153eea46efab0c8523510faa63d8fdcfba982141dde7debf3e0682736ca1a"} Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.620931 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20c153eea46efab0c8523510faa63d8fdcfba982141dde7debf3e0682736ca1a" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.620941 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f9qhz" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.645293 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktcj5\" (UniqueName: \"kubernetes.io/projected/5aabe4a2-969f-44f7-aa29-60db47845f80-kube-api-access-ktcj5\") pod \"5aabe4a2-969f-44f7-aa29-60db47845f80\" (UID: \"5aabe4a2-969f-44f7-aa29-60db47845f80\") " Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.645354 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plbgt\" (UniqueName: \"kubernetes.io/projected/1190e7a3-cde5-467e-adbc-b20c6f7823d5-kube-api-access-plbgt\") pod \"1190e7a3-cde5-467e-adbc-b20c6f7823d5\" (UID: \"1190e7a3-cde5-467e-adbc-b20c6f7823d5\") " Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.645377 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc4adb0e-915e-4b69-ad6f-2f510f53e2e8-operator-scripts\") pod \"dc4adb0e-915e-4b69-ad6f-2f510f53e2e8\" (UID: \"dc4adb0e-915e-4b69-ad6f-2f510f53e2e8\") " Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.645411 4756 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wntc4\" (UniqueName: \"kubernetes.io/projected/8c62dc13-d167-48da-909e-eff8ca5852f4-kube-api-access-wntc4\") pod \"8c62dc13-d167-48da-909e-eff8ca5852f4\" (UID: \"8c62dc13-d167-48da-909e-eff8ca5852f4\") " Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.645457 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/928231e9-14ab-4f87-85c0-37a372d3ed9d-operator-scripts\") pod \"928231e9-14ab-4f87-85c0-37a372d3ed9d\" (UID: \"928231e9-14ab-4f87-85c0-37a372d3ed9d\") " Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.645611 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1190e7a3-cde5-467e-adbc-b20c6f7823d5-operator-scripts\") pod \"1190e7a3-cde5-467e-adbc-b20c6f7823d5\" (UID: \"1190e7a3-cde5-467e-adbc-b20c6f7823d5\") " Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.645638 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwwjn\" (UniqueName: \"kubernetes.io/projected/928231e9-14ab-4f87-85c0-37a372d3ed9d-kube-api-access-mwwjn\") pod \"928231e9-14ab-4f87-85c0-37a372d3ed9d\" (UID: \"928231e9-14ab-4f87-85c0-37a372d3ed9d\") " Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.645676 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c62dc13-d167-48da-909e-eff8ca5852f4-operator-scripts\") pod \"8c62dc13-d167-48da-909e-eff8ca5852f4\" (UID: \"8c62dc13-d167-48da-909e-eff8ca5852f4\") " Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.645789 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9hgx\" (UniqueName: \"kubernetes.io/projected/dc4adb0e-915e-4b69-ad6f-2f510f53e2e8-kube-api-access-z9hgx\") pod 
\"dc4adb0e-915e-4b69-ad6f-2f510f53e2e8\" (UID: \"dc4adb0e-915e-4b69-ad6f-2f510f53e2e8\") " Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.645851 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5aabe4a2-969f-44f7-aa29-60db47845f80-operator-scripts\") pod \"5aabe4a2-969f-44f7-aa29-60db47845f80\" (UID: \"5aabe4a2-969f-44f7-aa29-60db47845f80\") " Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.646621 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/928231e9-14ab-4f87-85c0-37a372d3ed9d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "928231e9-14ab-4f87-85c0-37a372d3ed9d" (UID: "928231e9-14ab-4f87-85c0-37a372d3ed9d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.647907 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aabe4a2-969f-44f7-aa29-60db47845f80-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5aabe4a2-969f-44f7-aa29-60db47845f80" (UID: "5aabe4a2-969f-44f7-aa29-60db47845f80"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.648473 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1190e7a3-cde5-467e-adbc-b20c6f7823d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1190e7a3-cde5-467e-adbc-b20c6f7823d5" (UID: "1190e7a3-cde5-467e-adbc-b20c6f7823d5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.648505 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c62dc13-d167-48da-909e-eff8ca5852f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c62dc13-d167-48da-909e-eff8ca5852f4" (UID: "8c62dc13-d167-48da-909e-eff8ca5852f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.649156 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc4adb0e-915e-4b69-ad6f-2f510f53e2e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc4adb0e-915e-4b69-ad6f-2f510f53e2e8" (UID: "dc4adb0e-915e-4b69-ad6f-2f510f53e2e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.652298 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c62dc13-d167-48da-909e-eff8ca5852f4-kube-api-access-wntc4" (OuterVolumeSpecName: "kube-api-access-wntc4") pod "8c62dc13-d167-48da-909e-eff8ca5852f4" (UID: "8c62dc13-d167-48da-909e-eff8ca5852f4"). InnerVolumeSpecName "kube-api-access-wntc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.653209 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aabe4a2-969f-44f7-aa29-60db47845f80-kube-api-access-ktcj5" (OuterVolumeSpecName: "kube-api-access-ktcj5") pod "5aabe4a2-969f-44f7-aa29-60db47845f80" (UID: "5aabe4a2-969f-44f7-aa29-60db47845f80"). InnerVolumeSpecName "kube-api-access-ktcj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.655278 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1190e7a3-cde5-467e-adbc-b20c6f7823d5-kube-api-access-plbgt" (OuterVolumeSpecName: "kube-api-access-plbgt") pod "1190e7a3-cde5-467e-adbc-b20c6f7823d5" (UID: "1190e7a3-cde5-467e-adbc-b20c6f7823d5"). InnerVolumeSpecName "kube-api-access-plbgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.661050 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc4adb0e-915e-4b69-ad6f-2f510f53e2e8-kube-api-access-z9hgx" (OuterVolumeSpecName: "kube-api-access-z9hgx") pod "dc4adb0e-915e-4b69-ad6f-2f510f53e2e8" (UID: "dc4adb0e-915e-4b69-ad6f-2f510f53e2e8"). InnerVolumeSpecName "kube-api-access-z9hgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.661330 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/928231e9-14ab-4f87-85c0-37a372d3ed9d-kube-api-access-mwwjn" (OuterVolumeSpecName: "kube-api-access-mwwjn") pod "928231e9-14ab-4f87-85c0-37a372d3ed9d" (UID: "928231e9-14ab-4f87-85c0-37a372d3ed9d"). InnerVolumeSpecName "kube-api-access-mwwjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.747868 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9hgx\" (UniqueName: \"kubernetes.io/projected/dc4adb0e-915e-4b69-ad6f-2f510f53e2e8-kube-api-access-z9hgx\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.747905 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5aabe4a2-969f-44f7-aa29-60db47845f80-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.747916 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktcj5\" (UniqueName: \"kubernetes.io/projected/5aabe4a2-969f-44f7-aa29-60db47845f80-kube-api-access-ktcj5\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.747926 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plbgt\" (UniqueName: \"kubernetes.io/projected/1190e7a3-cde5-467e-adbc-b20c6f7823d5-kube-api-access-plbgt\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.747936 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc4adb0e-915e-4b69-ad6f-2f510f53e2e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.747945 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wntc4\" (UniqueName: \"kubernetes.io/projected/8c62dc13-d167-48da-909e-eff8ca5852f4-kube-api-access-wntc4\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.747953 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/928231e9-14ab-4f87-85c0-37a372d3ed9d-operator-scripts\") on node \"crc\" DevicePath \"\"" 
Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.747962 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1190e7a3-cde5-467e-adbc-b20c6f7823d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.747970 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwwjn\" (UniqueName: \"kubernetes.io/projected/928231e9-14ab-4f87-85c0-37a372d3ed9d-kube-api-access-mwwjn\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:08 crc kubenswrapper[4756]: I0318 14:23:08.747978 4756 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c62dc13-d167-48da-909e-eff8ca5852f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:09 crc kubenswrapper[4756]: I0318 14:23:09.336239 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7ade8da-71b8-402b-8b7c-d2b333ab31da" path="/var/lib/kubelet/pods/e7ade8da-71b8-402b-8b7c-d2b333ab31da/volumes" Mar 18 14:23:09 crc kubenswrapper[4756]: I0318 14:23:09.636642 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"90742574-07ba-4265-aa05-59c9f557caf0","Type":"ContainerStarted","Data":"cd0960d9bcfb8a12b76005a4b34d9aaf3efcf5f5762bf0da8d915bdd9d2864ec"} Mar 18 14:23:09 crc kubenswrapper[4756]: I0318 14:23:09.638053 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d168f1b-0779-47a8-8346-254ecfc9a126","Type":"ContainerStarted","Data":"11f159710b94074cabc769faf5eda89a2e2ea985d9eeb6b50ca631a9a45f0280"} Mar 18 14:23:09 crc kubenswrapper[4756]: I0318 14:23:09.641247 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3a5e6ffc-e769-4460-996d-134e5eed2153","Type":"ContainerStarted","Data":"dbb17aa335803875ca16acd83ac40593e6577d6053cd4c4ef7422810579383d6"} Mar 18 14:23:09 crc kubenswrapper[4756]: I0318 14:23:09.646369 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"506de922-637d-4174-aeaa-236a27140466","Type":"ContainerStarted","Data":"a798fdd918b8c74b4bf0fce9c3b0e4b846b273ec4f26e64a2712bd7a6d7a8212"} Mar 18 14:23:09 crc kubenswrapper[4756]: I0318 14:23:09.646698 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 14:23:09 crc kubenswrapper[4756]: I0318 14:23:09.665649 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.665622673 podStartE2EDuration="5.665622673s" podCreationTimestamp="2026-03-18 14:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:23:09.659299461 +0000 UTC m=+1390.973717426" watchObservedRunningTime="2026-03-18 14:23:09.665622673 +0000 UTC m=+1390.980040648" Mar 18 14:23:09 crc kubenswrapper[4756]: I0318 14:23:09.697197 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.697175825 podStartE2EDuration="5.697175825s" podCreationTimestamp="2026-03-18 14:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:23:09.68514552 +0000 UTC m=+1390.999563495" watchObservedRunningTime="2026-03-18 14:23:09.697175825 +0000 UTC m=+1391.011593800" Mar 18 14:23:09 crc kubenswrapper[4756]: I0318 14:23:09.724078 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.724060003 podStartE2EDuration="5.724060003s" 
podCreationTimestamp="2026-03-18 14:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:23:09.709023426 +0000 UTC m=+1391.023441401" watchObservedRunningTime="2026-03-18 14:23:09.724060003 +0000 UTC m=+1391.038477978" Mar 18 14:23:10 crc kubenswrapper[4756]: I0318 14:23:10.660921 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5e6ffc-e769-4460-996d-134e5eed2153","Type":"ContainerStarted","Data":"df97eeb0f03931abdbb133fc9f65ab7f44c07133450b942465efeb4da3117bf9"} Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.148073 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k4zq4"] Mar 18 14:23:14 crc kubenswrapper[4756]: E0318 14:23:14.149017 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ade8da-71b8-402b-8b7c-d2b333ab31da" containerName="neutron-api" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149032 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ade8da-71b8-402b-8b7c-d2b333ab31da" containerName="neutron-api" Mar 18 14:23:14 crc kubenswrapper[4756]: E0318 14:23:14.149042 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ade8da-71b8-402b-8b7c-d2b333ab31da" containerName="neutron-httpd" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149048 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ade8da-71b8-402b-8b7c-d2b333ab31da" containerName="neutron-httpd" Mar 18 14:23:14 crc kubenswrapper[4756]: E0318 14:23:14.149060 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aabe4a2-969f-44f7-aa29-60db47845f80" containerName="mariadb-account-create-update" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149066 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aabe4a2-969f-44f7-aa29-60db47845f80" containerName="mariadb-account-create-update" Mar 
18 14:23:14 crc kubenswrapper[4756]: E0318 14:23:14.149083 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1ea486-ed86-4ae5-9374-3538e9d1e4fc" containerName="mariadb-account-create-update" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149090 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1ea486-ed86-4ae5-9374-3538e9d1e4fc" containerName="mariadb-account-create-update" Mar 18 14:23:14 crc kubenswrapper[4756]: E0318 14:23:14.149109 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4adb0e-915e-4b69-ad6f-2f510f53e2e8" containerName="mariadb-account-create-update" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149131 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4adb0e-915e-4b69-ad6f-2f510f53e2e8" containerName="mariadb-account-create-update" Mar 18 14:23:14 crc kubenswrapper[4756]: E0318 14:23:14.149150 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="928231e9-14ab-4f87-85c0-37a372d3ed9d" containerName="mariadb-database-create" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149156 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="928231e9-14ab-4f87-85c0-37a372d3ed9d" containerName="mariadb-database-create" Mar 18 14:23:14 crc kubenswrapper[4756]: E0318 14:23:14.149168 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1190e7a3-cde5-467e-adbc-b20c6f7823d5" containerName="mariadb-database-create" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149174 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1190e7a3-cde5-467e-adbc-b20c6f7823d5" containerName="mariadb-database-create" Mar 18 14:23:14 crc kubenswrapper[4756]: E0318 14:23:14.149184 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c62dc13-d167-48da-909e-eff8ca5852f4" containerName="mariadb-database-create" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149190 4756 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8c62dc13-d167-48da-909e-eff8ca5852f4" containerName="mariadb-database-create" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149362 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7ade8da-71b8-402b-8b7c-d2b333ab31da" containerName="neutron-api" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149370 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c62dc13-d167-48da-909e-eff8ca5852f4" containerName="mariadb-database-create" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149387 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aabe4a2-969f-44f7-aa29-60db47845f80" containerName="mariadb-account-create-update" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149401 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1190e7a3-cde5-467e-adbc-b20c6f7823d5" containerName="mariadb-database-create" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149410 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="928231e9-14ab-4f87-85c0-37a372d3ed9d" containerName="mariadb-database-create" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149417 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7ade8da-71b8-402b-8b7c-d2b333ab31da" containerName="neutron-httpd" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149432 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4adb0e-915e-4b69-ad6f-2f510f53e2e8" containerName="mariadb-account-create-update" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.149442 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1ea486-ed86-4ae5-9374-3538e9d1e4fc" containerName="mariadb-account-create-update" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.150082 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.152502 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zxp4v" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.152611 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.152764 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.177738 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k4zq4"] Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.203998 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5e6ffc-e769-4460-996d-134e5eed2153","Type":"ContainerStarted","Data":"8aa5aa1fcf6f2bd1c77b59cf7a52f09eab06920799784686e29caba830b0e25d"} Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.204218 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerName="ceilometer-central-agent" containerID="cri-o://d16ba12a9c73df48af38ac83cd1d37e3f4f175c56831674025bd9fb8bcea932e" gracePeriod=30 Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.204504 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.204534 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerName="ceilometer-notification-agent" containerID="cri-o://dbb17aa335803875ca16acd83ac40593e6577d6053cd4c4ef7422810579383d6" gracePeriod=30 Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 
14:23:14.204573 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerName="proxy-httpd" containerID="cri-o://8aa5aa1fcf6f2bd1c77b59cf7a52f09eab06920799784686e29caba830b0e25d" gracePeriod=30 Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.204539 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerName="sg-core" containerID="cri-o://df97eeb0f03931abdbb133fc9f65ab7f44c07133450b942465efeb4da3117bf9" gracePeriod=30 Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.244239 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.7827407109999998 podStartE2EDuration="10.244213823s" podCreationTimestamp="2026-03-18 14:23:04 +0000 UTC" firstStartedPulling="2026-03-18 14:23:05.721168688 +0000 UTC m=+1387.035586663" lastFinishedPulling="2026-03-18 14:23:12.1826418 +0000 UTC m=+1393.497059775" observedRunningTime="2026-03-18 14:23:14.231781517 +0000 UTC m=+1395.546199502" watchObservedRunningTime="2026-03-18 14:23:14.244213823 +0000 UTC m=+1395.558631798" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.267432 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-scripts\") pod \"nova-cell0-conductor-db-sync-k4zq4\" (UID: \"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.267537 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlwhn\" (UniqueName: \"kubernetes.io/projected/73e64ca2-cc29-4a74-a970-b127ec0380f1-kube-api-access-qlwhn\") pod \"nova-cell0-conductor-db-sync-k4zq4\" (UID: 
\"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.267649 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-config-data\") pod \"nova-cell0-conductor-db-sync-k4zq4\" (UID: \"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.267772 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k4zq4\" (UID: \"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.370090 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlwhn\" (UniqueName: \"kubernetes.io/projected/73e64ca2-cc29-4a74-a970-b127ec0380f1-kube-api-access-qlwhn\") pod \"nova-cell0-conductor-db-sync-k4zq4\" (UID: \"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.370235 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-config-data\") pod \"nova-cell0-conductor-db-sync-k4zq4\" (UID: \"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.370268 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-combined-ca-bundle\") 
pod \"nova-cell0-conductor-db-sync-k4zq4\" (UID: \"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.370301 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-scripts\") pod \"nova-cell0-conductor-db-sync-k4zq4\" (UID: \"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.378646 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k4zq4\" (UID: \"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.379089 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-config-data\") pod \"nova-cell0-conductor-db-sync-k4zq4\" (UID: \"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.387765 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-scripts\") pod \"nova-cell0-conductor-db-sync-k4zq4\" (UID: \"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.387860 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlwhn\" (UniqueName: \"kubernetes.io/projected/73e64ca2-cc29-4a74-a970-b127ec0380f1-kube-api-access-qlwhn\") pod \"nova-cell0-conductor-db-sync-k4zq4\" (UID: 
\"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.471691 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:14 crc kubenswrapper[4756]: W0318 14:23:14.976070 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73e64ca2_cc29_4a74_a970_b127ec0380f1.slice/crio-98f0fba6f39b3644919a984424daeffd603203588b23500a0b97310782e431f8 WatchSource:0}: Error finding container 98f0fba6f39b3644919a984424daeffd603203588b23500a0b97310782e431f8: Status 404 returned error can't find the container with id 98f0fba6f39b3644919a984424daeffd603203588b23500a0b97310782e431f8 Mar 18 14:23:14 crc kubenswrapper[4756]: I0318 14:23:14.979830 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k4zq4"] Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.227345 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k4zq4" event={"ID":"73e64ca2-cc29-4a74-a970-b127ec0380f1","Type":"ContainerStarted","Data":"98f0fba6f39b3644919a984424daeffd603203588b23500a0b97310782e431f8"} Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.231475 4756 generic.go:334] "Generic (PLEG): container finished" podID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerID="8aa5aa1fcf6f2bd1c77b59cf7a52f09eab06920799784686e29caba830b0e25d" exitCode=0 Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.231517 4756 generic.go:334] "Generic (PLEG): container finished" podID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerID="df97eeb0f03931abdbb133fc9f65ab7f44c07133450b942465efeb4da3117bf9" exitCode=2 Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.231526 4756 generic.go:334] "Generic (PLEG): container finished" podID="3a5e6ffc-e769-4460-996d-134e5eed2153" 
containerID="dbb17aa335803875ca16acd83ac40593e6577d6053cd4c4ef7422810579383d6" exitCode=0 Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.231536 4756 generic.go:334] "Generic (PLEG): container finished" podID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerID="d16ba12a9c73df48af38ac83cd1d37e3f4f175c56831674025bd9fb8bcea932e" exitCode=0 Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.231565 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5e6ffc-e769-4460-996d-134e5eed2153","Type":"ContainerDied","Data":"8aa5aa1fcf6f2bd1c77b59cf7a52f09eab06920799784686e29caba830b0e25d"} Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.231594 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5e6ffc-e769-4460-996d-134e5eed2153","Type":"ContainerDied","Data":"df97eeb0f03931abdbb133fc9f65ab7f44c07133450b942465efeb4da3117bf9"} Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.231633 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5e6ffc-e769-4460-996d-134e5eed2153","Type":"ContainerDied","Data":"dbb17aa335803875ca16acd83ac40593e6577d6053cd4c4ef7422810579383d6"} Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.231647 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5e6ffc-e769-4460-996d-134e5eed2153","Type":"ContainerDied","Data":"d16ba12a9c73df48af38ac83cd1d37e3f4f175c56831674025bd9fb8bcea932e"} Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.255627 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.255672 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.258183 4756 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.304432 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.314568 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.314609 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.368736 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.368798 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.378608 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.391855 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5e6ffc-e769-4460-996d-134e5eed2153-log-httpd\") pod \"3a5e6ffc-e769-4460-996d-134e5eed2153\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.391926 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-sg-core-conf-yaml\") pod \"3a5e6ffc-e769-4460-996d-134e5eed2153\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.391953 4756 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-scripts\") pod \"3a5e6ffc-e769-4460-996d-134e5eed2153\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.392057 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-combined-ca-bundle\") pod \"3a5e6ffc-e769-4460-996d-134e5eed2153\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.392095 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-config-data\") pod \"3a5e6ffc-e769-4460-996d-134e5eed2153\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.392178 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfcdc\" (UniqueName: \"kubernetes.io/projected/3a5e6ffc-e769-4460-996d-134e5eed2153-kube-api-access-xfcdc\") pod \"3a5e6ffc-e769-4460-996d-134e5eed2153\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.392207 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5e6ffc-e769-4460-996d-134e5eed2153-run-httpd\") pod \"3a5e6ffc-e769-4460-996d-134e5eed2153\" (UID: \"3a5e6ffc-e769-4460-996d-134e5eed2153\") " Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.393965 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a5e6ffc-e769-4460-996d-134e5eed2153-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3a5e6ffc-e769-4460-996d-134e5eed2153" (UID: "3a5e6ffc-e769-4460-996d-134e5eed2153"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.394761 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a5e6ffc-e769-4460-996d-134e5eed2153-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3a5e6ffc-e769-4460-996d-134e5eed2153" (UID: "3a5e6ffc-e769-4460-996d-134e5eed2153"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.401731 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-scripts" (OuterVolumeSpecName: "scripts") pod "3a5e6ffc-e769-4460-996d-134e5eed2153" (UID: "3a5e6ffc-e769-4460-996d-134e5eed2153"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.402667 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a5e6ffc-e769-4460-996d-134e5eed2153-kube-api-access-xfcdc" (OuterVolumeSpecName: "kube-api-access-xfcdc") pod "3a5e6ffc-e769-4460-996d-134e5eed2153" (UID: "3a5e6ffc-e769-4460-996d-134e5eed2153"). InnerVolumeSpecName "kube-api-access-xfcdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.441418 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3a5e6ffc-e769-4460-996d-134e5eed2153" (UID: "3a5e6ffc-e769-4460-996d-134e5eed2153"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.486270 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a5e6ffc-e769-4460-996d-134e5eed2153" (UID: "3a5e6ffc-e769-4460-996d-134e5eed2153"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.494629 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.494665 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfcdc\" (UniqueName: \"kubernetes.io/projected/3a5e6ffc-e769-4460-996d-134e5eed2153-kube-api-access-xfcdc\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.494677 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5e6ffc-e769-4460-996d-134e5eed2153-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.494684 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5e6ffc-e769-4460-996d-134e5eed2153-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.494692 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.494700 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.511791 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-config-data" (OuterVolumeSpecName: "config-data") pod "3a5e6ffc-e769-4460-996d-134e5eed2153" (UID: "3a5e6ffc-e769-4460-996d-134e5eed2153"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:15 crc kubenswrapper[4756]: I0318 14:23:15.596662 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a5e6ffc-e769-4460-996d-134e5eed2153-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.252600 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5e6ffc-e769-4460-996d-134e5eed2153","Type":"ContainerDied","Data":"5087fb1586f86a0f2037d33cb3cb00af939b836477a9d451be08920bef8ff937"} Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.254087 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.256483 4756 scope.go:117] "RemoveContainer" containerID="8aa5aa1fcf6f2bd1c77b59cf7a52f09eab06920799784686e29caba830b0e25d" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.256763 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.256785 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.256797 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.256806 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.346807 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.368195 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.371393 4756 scope.go:117] "RemoveContainer" containerID="df97eeb0f03931abdbb133fc9f65ab7f44c07133450b942465efeb4da3117bf9" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.386195 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:16 crc kubenswrapper[4756]: E0318 14:23:16.386805 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerName="ceilometer-notification-agent" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.386837 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerName="ceilometer-notification-agent" Mar 18 
14:23:16 crc kubenswrapper[4756]: E0318 14:23:16.386856 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerName="sg-core" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.386867 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerName="sg-core" Mar 18 14:23:16 crc kubenswrapper[4756]: E0318 14:23:16.386889 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerName="ceilometer-central-agent" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.386897 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerName="ceilometer-central-agent" Mar 18 14:23:16 crc kubenswrapper[4756]: E0318 14:23:16.386918 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerName="proxy-httpd" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.386928 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerName="proxy-httpd" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.387268 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerName="ceilometer-notification-agent" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.387296 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerName="ceilometer-central-agent" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.387316 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" containerName="sg-core" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.387330 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" 
containerName="proxy-httpd" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.393307 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.395709 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.395994 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.411003 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.430108 4756 scope.go:117] "RemoveContainer" containerID="dbb17aa335803875ca16acd83ac40593e6577d6053cd4c4ef7422810579383d6" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.466206 4756 scope.go:117] "RemoveContainer" containerID="d16ba12a9c73df48af38ac83cd1d37e3f4f175c56831674025bd9fb8bcea932e" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.521224 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-scripts\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.521294 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b745493-807a-450b-9ab9-56d0d86fcfe4-run-httpd\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.521335 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.521359 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.521528 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-config-data\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.521698 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz4h5\" (UniqueName: \"kubernetes.io/projected/1b745493-807a-450b-9ab9-56d0d86fcfe4-kube-api-access-zz4h5\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.521950 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b745493-807a-450b-9ab9-56d0d86fcfe4-log-httpd\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.623342 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b745493-807a-450b-9ab9-56d0d86fcfe4-log-httpd\") pod \"ceilometer-0\" (UID: 
\"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.623403 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-scripts\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.623439 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b745493-807a-450b-9ab9-56d0d86fcfe4-run-httpd\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.623467 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.623483 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.623507 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-config-data\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.623559 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz4h5\" 
(UniqueName: \"kubernetes.io/projected/1b745493-807a-450b-9ab9-56d0d86fcfe4-kube-api-access-zz4h5\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.624230 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b745493-807a-450b-9ab9-56d0d86fcfe4-log-httpd\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.626397 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b745493-807a-450b-9ab9-56d0d86fcfe4-run-httpd\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.630320 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.631223 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.633796 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-scripts\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.634168 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-config-data\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.646866 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz4h5\" (UniqueName: \"kubernetes.io/projected/1b745493-807a-450b-9ab9-56d0d86fcfe4-kube-api-access-zz4h5\") pod \"ceilometer-0\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " pod="openstack/ceilometer-0" Mar 18 14:23:16 crc kubenswrapper[4756]: I0318 14:23:16.731795 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:23:17 crc kubenswrapper[4756]: I0318 14:23:17.358757 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a5e6ffc-e769-4460-996d-134e5eed2153" path="/var/lib/kubelet/pods/3a5e6ffc-e769-4460-996d-134e5eed2153/volumes" Mar 18 14:23:17 crc kubenswrapper[4756]: I0318 14:23:17.456411 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:17 crc kubenswrapper[4756]: I0318 14:23:17.472521 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 14:23:18 crc kubenswrapper[4756]: I0318 14:23:18.298219 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 14:23:18 crc kubenswrapper[4756]: I0318 14:23:18.298504 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 14:23:18 crc kubenswrapper[4756]: I0318 14:23:18.299494 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b745493-807a-450b-9ab9-56d0d86fcfe4","Type":"ContainerStarted","Data":"b6c31f895129d562350afa6a0859b79ab6032ec9a4c5144ec833647accdf2bc6"} Mar 18 14:23:18 crc kubenswrapper[4756]: I0318 
14:23:18.299569 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 14:23:18 crc kubenswrapper[4756]: I0318 14:23:18.299585 4756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 14:23:19 crc kubenswrapper[4756]: I0318 14:23:19.002684 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 14:23:19 crc kubenswrapper[4756]: I0318 14:23:19.010101 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 14:23:19 crc kubenswrapper[4756]: I0318 14:23:19.023928 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Mar 18 14:23:19 crc kubenswrapper[4756]: I0318 14:23:19.058218 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 14:23:19 crc kubenswrapper[4756]: I0318 14:23:19.084801 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 14:23:19 crc kubenswrapper[4756]: I0318 14:23:19.344908 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b745493-807a-450b-9ab9-56d0d86fcfe4","Type":"ContainerStarted","Data":"328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6"} Mar 18 14:23:19 crc kubenswrapper[4756]: I0318 14:23:19.344954 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b745493-807a-450b-9ab9-56d0d86fcfe4","Type":"ContainerStarted","Data":"3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb"} Mar 18 14:23:19 crc kubenswrapper[4756]: I0318 14:23:19.884515 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:20 crc kubenswrapper[4756]: I0318 14:23:20.175820 4756 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:23:20 crc kubenswrapper[4756]: I0318 14:23:20.399093 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b745493-807a-450b-9ab9-56d0d86fcfe4","Type":"ContainerStarted","Data":"6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a"} Mar 18 14:23:20 crc kubenswrapper[4756]: I0318 14:23:20.636383 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-654d496b7d-zrbn7" Mar 18 14:23:20 crc kubenswrapper[4756]: I0318 14:23:20.714670 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5f97847bb-hrplw"] Mar 18 14:23:20 crc kubenswrapper[4756]: I0318 14:23:20.714951 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5f97847bb-hrplw" podUID="8d50193a-fddc-4dc4-a597-96d67e28a55b" containerName="placement-log" containerID="cri-o://bde8356d73eeed484d3c8d6dc02ba8f18da4576e7e28c66218ae5f6e3fbe5608" gracePeriod=30 Mar 18 14:23:20 crc kubenswrapper[4756]: I0318 14:23:20.715383 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5f97847bb-hrplw" podUID="8d50193a-fddc-4dc4-a597-96d67e28a55b" containerName="placement-api" containerID="cri-o://d64f2b0282847e55d050121638c79416baaf546c7b5340972c9b5c6e4f8e49eb" gracePeriod=30 Mar 18 14:23:21 crc kubenswrapper[4756]: I0318 14:23:21.409702 4756 generic.go:334] "Generic (PLEG): container finished" podID="8d50193a-fddc-4dc4-a597-96d67e28a55b" containerID="bde8356d73eeed484d3c8d6dc02ba8f18da4576e7e28c66218ae5f6e3fbe5608" exitCode=143 Mar 18 14:23:21 crc kubenswrapper[4756]: I0318 14:23:21.409786 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f97847bb-hrplw" event={"ID":"8d50193a-fddc-4dc4-a597-96d67e28a55b","Type":"ContainerDied","Data":"bde8356d73eeed484d3c8d6dc02ba8f18da4576e7e28c66218ae5f6e3fbe5608"} 
Mar 18 14:23:24 crc kubenswrapper[4756]: I0318 14:23:24.445033 4756 generic.go:334] "Generic (PLEG): container finished" podID="8d50193a-fddc-4dc4-a597-96d67e28a55b" containerID="d64f2b0282847e55d050121638c79416baaf546c7b5340972c9b5c6e4f8e49eb" exitCode=0 Mar 18 14:23:24 crc kubenswrapper[4756]: I0318 14:23:24.445145 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f97847bb-hrplw" event={"ID":"8d50193a-fddc-4dc4-a597-96d67e28a55b","Type":"ContainerDied","Data":"d64f2b0282847e55d050121638c79416baaf546c7b5340972c9b5c6e4f8e49eb"} Mar 18 14:23:26 crc kubenswrapper[4756]: I0318 14:23:26.957914 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.073816 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-config-data\") pod \"8d50193a-fddc-4dc4-a597-96d67e28a55b\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.073875 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-public-tls-certs\") pod \"8d50193a-fddc-4dc4-a597-96d67e28a55b\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.073903 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-scripts\") pod \"8d50193a-fddc-4dc4-a597-96d67e28a55b\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.074018 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8d50193a-fddc-4dc4-a597-96d67e28a55b-logs\") pod \"8d50193a-fddc-4dc4-a597-96d67e28a55b\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.074522 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d50193a-fddc-4dc4-a597-96d67e28a55b-logs" (OuterVolumeSpecName: "logs") pod "8d50193a-fddc-4dc4-a597-96d67e28a55b" (UID: "8d50193a-fddc-4dc4-a597-96d67e28a55b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.074635 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-combined-ca-bundle\") pod \"8d50193a-fddc-4dc4-a597-96d67e28a55b\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.074981 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-internal-tls-certs\") pod \"8d50193a-fddc-4dc4-a597-96d67e28a55b\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.075017 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-659zn\" (UniqueName: \"kubernetes.io/projected/8d50193a-fddc-4dc4-a597-96d67e28a55b-kube-api-access-659zn\") pod \"8d50193a-fddc-4dc4-a597-96d67e28a55b\" (UID: \"8d50193a-fddc-4dc4-a597-96d67e28a55b\") " Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.078303 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d50193a-fddc-4dc4-a597-96d67e28a55b-logs\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.080381 4756 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-scripts" (OuterVolumeSpecName: "scripts") pod "8d50193a-fddc-4dc4-a597-96d67e28a55b" (UID: "8d50193a-fddc-4dc4-a597-96d67e28a55b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.080779 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d50193a-fddc-4dc4-a597-96d67e28a55b-kube-api-access-659zn" (OuterVolumeSpecName: "kube-api-access-659zn") pod "8d50193a-fddc-4dc4-a597-96d67e28a55b" (UID: "8d50193a-fddc-4dc4-a597-96d67e28a55b"). InnerVolumeSpecName "kube-api-access-659zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.141383 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-config-data" (OuterVolumeSpecName: "config-data") pod "8d50193a-fddc-4dc4-a597-96d67e28a55b" (UID: "8d50193a-fddc-4dc4-a597-96d67e28a55b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.147379 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d50193a-fddc-4dc4-a597-96d67e28a55b" (UID: "8d50193a-fddc-4dc4-a597-96d67e28a55b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.180429 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.180459 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-659zn\" (UniqueName: \"kubernetes.io/projected/8d50193a-fddc-4dc4-a597-96d67e28a55b-kube-api-access-659zn\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.180470 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.180479 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.201082 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8d50193a-fddc-4dc4-a597-96d67e28a55b" (UID: "8d50193a-fddc-4dc4-a597-96d67e28a55b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.217279 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8d50193a-fddc-4dc4-a597-96d67e28a55b" (UID: "8d50193a-fddc-4dc4-a597-96d67e28a55b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.282990 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.283029 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d50193a-fddc-4dc4-a597-96d67e28a55b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.487564 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k4zq4" event={"ID":"73e64ca2-cc29-4a74-a970-b127ec0380f1","Type":"ContainerStarted","Data":"3d358fd515b011708016827117bac120bc582a21947c91d875a22ff1b818a612"} Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.491883 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b745493-807a-450b-9ab9-56d0d86fcfe4","Type":"ContainerStarted","Data":"185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea"} Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.491979 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="ceilometer-central-agent" containerID="cri-o://3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb" gracePeriod=30 Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.492009 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="sg-core" containerID="cri-o://6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a" gracePeriod=30 Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.492045 4756 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.492063 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="proxy-httpd" containerID="cri-o://185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea" gracePeriod=30 Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.492099 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="ceilometer-notification-agent" containerID="cri-o://328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6" gracePeriod=30 Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.505710 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f97847bb-hrplw" event={"ID":"8d50193a-fddc-4dc4-a597-96d67e28a55b","Type":"ContainerDied","Data":"64358c4b728140d8efc9e280dce700abf9ed2f863c7c7045c50100b4bb1fae9e"} Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.505793 4756 scope.go:117] "RemoveContainer" containerID="d64f2b0282847e55d050121638c79416baaf546c7b5340972c9b5c6e4f8e49eb" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.505973 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5f97847bb-hrplw" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.507253 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-k4zq4" podStartSLOduration=1.666429089 podStartE2EDuration="13.506907753s" podCreationTimestamp="2026-03-18 14:23:14 +0000 UTC" firstStartedPulling="2026-03-18 14:23:14.979376182 +0000 UTC m=+1396.293794157" lastFinishedPulling="2026-03-18 14:23:26.819854836 +0000 UTC m=+1408.134272821" observedRunningTime="2026-03-18 14:23:27.501908059 +0000 UTC m=+1408.816326034" watchObservedRunningTime="2026-03-18 14:23:27.506907753 +0000 UTC m=+1408.821325738" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.532114 4756 scope.go:117] "RemoveContainer" containerID="bde8356d73eeed484d3c8d6dc02ba8f18da4576e7e28c66218ae5f6e3fbe5608" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.541696 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.217456486 podStartE2EDuration="11.541673394s" podCreationTimestamp="2026-03-18 14:23:16 +0000 UTC" firstStartedPulling="2026-03-18 14:23:17.424213437 +0000 UTC m=+1398.738631412" lastFinishedPulling="2026-03-18 14:23:26.748430345 +0000 UTC m=+1408.062848320" observedRunningTime="2026-03-18 14:23:27.523668806 +0000 UTC m=+1408.838086781" watchObservedRunningTime="2026-03-18 14:23:27.541673394 +0000 UTC m=+1408.856091369" Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.557956 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5f97847bb-hrplw"] Mar 18 14:23:27 crc kubenswrapper[4756]: I0318 14:23:27.566910 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5f97847bb-hrplw"] Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.321846 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.405882 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b745493-807a-450b-9ab9-56d0d86fcfe4-log-httpd\") pod \"1b745493-807a-450b-9ab9-56d0d86fcfe4\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.405932 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-config-data\") pod \"1b745493-807a-450b-9ab9-56d0d86fcfe4\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.405949 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-sg-core-conf-yaml\") pod \"1b745493-807a-450b-9ab9-56d0d86fcfe4\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.406048 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b745493-807a-450b-9ab9-56d0d86fcfe4-run-httpd\") pod \"1b745493-807a-450b-9ab9-56d0d86fcfe4\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.406065 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-scripts\") pod \"1b745493-807a-450b-9ab9-56d0d86fcfe4\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.406157 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz4h5\" (UniqueName: 
\"kubernetes.io/projected/1b745493-807a-450b-9ab9-56d0d86fcfe4-kube-api-access-zz4h5\") pod \"1b745493-807a-450b-9ab9-56d0d86fcfe4\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.406234 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-combined-ca-bundle\") pod \"1b745493-807a-450b-9ab9-56d0d86fcfe4\" (UID: \"1b745493-807a-450b-9ab9-56d0d86fcfe4\") " Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.406409 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b745493-807a-450b-9ab9-56d0d86fcfe4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1b745493-807a-450b-9ab9-56d0d86fcfe4" (UID: "1b745493-807a-450b-9ab9-56d0d86fcfe4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.406585 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b745493-807a-450b-9ab9-56d0d86fcfe4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1b745493-807a-450b-9ab9-56d0d86fcfe4" (UID: "1b745493-807a-450b-9ab9-56d0d86fcfe4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.406689 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b745493-807a-450b-9ab9-56d0d86fcfe4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.406701 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b745493-807a-450b-9ab9-56d0d86fcfe4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.430842 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b745493-807a-450b-9ab9-56d0d86fcfe4-kube-api-access-zz4h5" (OuterVolumeSpecName: "kube-api-access-zz4h5") pod "1b745493-807a-450b-9ab9-56d0d86fcfe4" (UID: "1b745493-807a-450b-9ab9-56d0d86fcfe4"). InnerVolumeSpecName "kube-api-access-zz4h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.432251 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-scripts" (OuterVolumeSpecName: "scripts") pod "1b745493-807a-450b-9ab9-56d0d86fcfe4" (UID: "1b745493-807a-450b-9ab9-56d0d86fcfe4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.444748 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1b745493-807a-450b-9ab9-56d0d86fcfe4" (UID: "1b745493-807a-450b-9ab9-56d0d86fcfe4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.502799 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b745493-807a-450b-9ab9-56d0d86fcfe4" (UID: "1b745493-807a-450b-9ab9-56d0d86fcfe4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.508117 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.508162 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.508173 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.508182 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz4h5\" (UniqueName: \"kubernetes.io/projected/1b745493-807a-450b-9ab9-56d0d86fcfe4-kube-api-access-zz4h5\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.524442 4756 generic.go:334] "Generic (PLEG): container finished" podID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerID="185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea" exitCode=0 Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.524472 4756 generic.go:334] "Generic (PLEG): container finished" podID="1b745493-807a-450b-9ab9-56d0d86fcfe4" 
containerID="6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a" exitCode=2 Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.524480 4756 generic.go:334] "Generic (PLEG): container finished" podID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerID="328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6" exitCode=0 Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.524489 4756 generic.go:334] "Generic (PLEG): container finished" podID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerID="3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb" exitCode=0 Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.525448 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.525919 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b745493-807a-450b-9ab9-56d0d86fcfe4","Type":"ContainerDied","Data":"185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea"} Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.526010 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b745493-807a-450b-9ab9-56d0d86fcfe4","Type":"ContainerDied","Data":"6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a"} Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.526046 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b745493-807a-450b-9ab9-56d0d86fcfe4","Type":"ContainerDied","Data":"328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6"} Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.526060 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b745493-807a-450b-9ab9-56d0d86fcfe4","Type":"ContainerDied","Data":"3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb"} Mar 18 14:23:28 crc 
kubenswrapper[4756]: I0318 14:23:28.526072 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b745493-807a-450b-9ab9-56d0d86fcfe4","Type":"ContainerDied","Data":"b6c31f895129d562350afa6a0859b79ab6032ec9a4c5144ec833647accdf2bc6"} Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.526100 4756 scope.go:117] "RemoveContainer" containerID="185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.556773 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-config-data" (OuterVolumeSpecName: "config-data") pod "1b745493-807a-450b-9ab9-56d0d86fcfe4" (UID: "1b745493-807a-450b-9ab9-56d0d86fcfe4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.557407 4756 scope.go:117] "RemoveContainer" containerID="6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.585261 4756 scope.go:117] "RemoveContainer" containerID="328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.610064 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b745493-807a-450b-9ab9-56d0d86fcfe4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.611493 4756 scope.go:117] "RemoveContainer" containerID="3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.639658 4756 scope.go:117] "RemoveContainer" containerID="185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea" Mar 18 14:23:28 crc kubenswrapper[4756]: E0318 14:23:28.640291 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea\": container with ID starting with 185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea not found: ID does not exist" containerID="185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.640360 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea"} err="failed to get container status \"185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea\": rpc error: code = NotFound desc = could not find container \"185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea\": container with ID starting with 185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea not found: ID does not exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.640388 4756 scope.go:117] "RemoveContainer" containerID="6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a" Mar 18 14:23:28 crc kubenswrapper[4756]: E0318 14:23:28.640875 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a\": container with ID starting with 6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a not found: ID does not exist" containerID="6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.640914 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a"} err="failed to get container status \"6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a\": rpc error: code = NotFound desc = could not find container 
\"6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a\": container with ID starting with 6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a not found: ID does not exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.640935 4756 scope.go:117] "RemoveContainer" containerID="328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6" Mar 18 14:23:28 crc kubenswrapper[4756]: E0318 14:23:28.641538 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6\": container with ID starting with 328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6 not found: ID does not exist" containerID="328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.641585 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6"} err="failed to get container status \"328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6\": rpc error: code = NotFound desc = could not find container \"328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6\": container with ID starting with 328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6 not found: ID does not exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.641617 4756 scope.go:117] "RemoveContainer" containerID="3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb" Mar 18 14:23:28 crc kubenswrapper[4756]: E0318 14:23:28.641986 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb\": container with ID starting with 3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb not found: ID does not exist" 
containerID="3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.642016 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb"} err="failed to get container status \"3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb\": rpc error: code = NotFound desc = could not find container \"3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb\": container with ID starting with 3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb not found: ID does not exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.642033 4756 scope.go:117] "RemoveContainer" containerID="185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.642370 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea"} err="failed to get container status \"185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea\": rpc error: code = NotFound desc = could not find container \"185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea\": container with ID starting with 185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea not found: ID does not exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.642413 4756 scope.go:117] "RemoveContainer" containerID="6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.642880 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a"} err="failed to get container status \"6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a\": rpc error: code = NotFound desc = could 
not find container \"6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a\": container with ID starting with 6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a not found: ID does not exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.642907 4756 scope.go:117] "RemoveContainer" containerID="328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.643161 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6"} err="failed to get container status \"328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6\": rpc error: code = NotFound desc = could not find container \"328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6\": container with ID starting with 328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6 not found: ID does not exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.643185 4756 scope.go:117] "RemoveContainer" containerID="3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.643531 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb"} err="failed to get container status \"3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb\": rpc error: code = NotFound desc = could not find container \"3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb\": container with ID starting with 3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb not found: ID does not exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.643554 4756 scope.go:117] "RemoveContainer" containerID="185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 
14:23:28.643809 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea"} err="failed to get container status \"185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea\": rpc error: code = NotFound desc = could not find container \"185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea\": container with ID starting with 185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea not found: ID does not exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.643836 4756 scope.go:117] "RemoveContainer" containerID="6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.644213 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a"} err="failed to get container status \"6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a\": rpc error: code = NotFound desc = could not find container \"6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a\": container with ID starting with 6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a not found: ID does not exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.644238 4756 scope.go:117] "RemoveContainer" containerID="328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.644529 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6"} err="failed to get container status \"328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6\": rpc error: code = NotFound desc = could not find container \"328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6\": container with ID starting with 
328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6 not found: ID does not exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.644559 4756 scope.go:117] "RemoveContainer" containerID="3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.644883 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb"} err="failed to get container status \"3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb\": rpc error: code = NotFound desc = could not find container \"3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb\": container with ID starting with 3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb not found: ID does not exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.644905 4756 scope.go:117] "RemoveContainer" containerID="185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.645236 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea"} err="failed to get container status \"185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea\": rpc error: code = NotFound desc = could not find container \"185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea\": container with ID starting with 185aae4a640e78345d318dc8fc65c226a9130621adcc0501180f20b4f318c9ea not found: ID does not exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.645261 4756 scope.go:117] "RemoveContainer" containerID="6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.645612 4756 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a"} err="failed to get container status \"6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a\": rpc error: code = NotFound desc = could not find container \"6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a\": container with ID starting with 6cabde29a60a1d9c71666eadf079cc94652e43f04bba7b46e5b052c8ad424a8a not found: ID does not exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.645633 4756 scope.go:117] "RemoveContainer" containerID="328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.646048 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6"} err="failed to get container status \"328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6\": rpc error: code = NotFound desc = could not find container \"328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6\": container with ID starting with 328d0b922348a64faa51131941effedb36b55ad6b02a34e2e44d99fe2da7c8e6 not found: ID does not exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.646081 4756 scope.go:117] "RemoveContainer" containerID="3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.646646 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb"} err="failed to get container status \"3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb\": rpc error: code = NotFound desc = could not find container \"3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb\": container with ID starting with 3722600af0c64b3f1f8d6f94e3c98d7d83dbc171486d22ff352e1c7b50eb91eb not found: ID does not 
exist" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.883244 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.898143 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.909211 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:28 crc kubenswrapper[4756]: E0318 14:23:28.909628 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="ceilometer-notification-agent" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.909646 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="ceilometer-notification-agent" Mar 18 14:23:28 crc kubenswrapper[4756]: E0318 14:23:28.909656 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="proxy-httpd" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.909662 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="proxy-httpd" Mar 18 14:23:28 crc kubenswrapper[4756]: E0318 14:23:28.909675 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d50193a-fddc-4dc4-a597-96d67e28a55b" containerName="placement-log" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.909682 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d50193a-fddc-4dc4-a597-96d67e28a55b" containerName="placement-log" Mar 18 14:23:28 crc kubenswrapper[4756]: E0318 14:23:28.909693 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="ceilometer-central-agent" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.909698 4756 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="ceilometer-central-agent" Mar 18 14:23:28 crc kubenswrapper[4756]: E0318 14:23:28.909736 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="sg-core" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.909741 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="sg-core" Mar 18 14:23:28 crc kubenswrapper[4756]: E0318 14:23:28.909755 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d50193a-fddc-4dc4-a597-96d67e28a55b" containerName="placement-api" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.909760 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d50193a-fddc-4dc4-a597-96d67e28a55b" containerName="placement-api" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.909940 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="sg-core" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.909949 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d50193a-fddc-4dc4-a597-96d67e28a55b" containerName="placement-api" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.909961 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="proxy-httpd" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.909972 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d50193a-fddc-4dc4-a597-96d67e28a55b" containerName="placement-log" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.909984 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="ceilometer-central-agent" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.910001 4756 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" containerName="ceilometer-notification-agent" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.911764 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.913834 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.914248 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 14:23:28 crc kubenswrapper[4756]: I0318 14:23:28.922547 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.018792 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f28d65f-de6b-4458-a802-960f138e1296-run-httpd\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.018841 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-scripts\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.018868 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f28d65f-de6b-4458-a802-960f138e1296-log-httpd\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.018889 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.018906 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8kwk\" (UniqueName: \"kubernetes.io/projected/6f28d65f-de6b-4458-a802-960f138e1296-kube-api-access-n8kwk\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.018950 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.019034 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-config-data\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.120548 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-config-data\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.120662 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f28d65f-de6b-4458-a802-960f138e1296-run-httpd\") pod \"ceilometer-0\" (UID: 
\"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.120701 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-scripts\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.120723 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f28d65f-de6b-4458-a802-960f138e1296-log-httpd\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.120750 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.120776 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8kwk\" (UniqueName: \"kubernetes.io/projected/6f28d65f-de6b-4458-a802-960f138e1296-kube-api-access-n8kwk\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.120835 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.121202 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f28d65f-de6b-4458-a802-960f138e1296-run-httpd\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.121483 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f28d65f-de6b-4458-a802-960f138e1296-log-httpd\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.125173 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-scripts\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.125644 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.125956 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.127159 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-config-data\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.139298 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8kwk\" (UniqueName: \"kubernetes.io/projected/6f28d65f-de6b-4458-a802-960f138e1296-kube-api-access-n8kwk\") pod \"ceilometer-0\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.242583 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.328077 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b745493-807a-450b-9ab9-56d0d86fcfe4" path="/var/lib/kubelet/pods/1b745493-807a-450b-9ab9-56d0d86fcfe4/volumes" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.328811 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d50193a-fddc-4dc4-a597-96d67e28a55b" path="/var/lib/kubelet/pods/8d50193a-fddc-4dc4-a597-96d67e28a55b/volumes" Mar 18 14:23:29 crc kubenswrapper[4756]: I0318 14:23:29.708749 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:29 crc kubenswrapper[4756]: W0318 14:23:29.729887 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f28d65f_de6b_4458_a802_960f138e1296.slice/crio-29a072a87a0913d56bda2bdbb83e997ff294bc1ab44d47c0937748f54faf210e WatchSource:0}: Error finding container 29a072a87a0913d56bda2bdbb83e997ff294bc1ab44d47c0937748f54faf210e: Status 404 returned error can't find the container with id 29a072a87a0913d56bda2bdbb83e997ff294bc1ab44d47c0937748f54faf210e Mar 18 14:23:30 crc kubenswrapper[4756]: I0318 14:23:30.547377 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f28d65f-de6b-4458-a802-960f138e1296","Type":"ContainerStarted","Data":"bf099ea40a2a7c1c719f24ec8c26b35a5dbba2abbec7685301ac851230201ba0"} Mar 18 14:23:30 crc kubenswrapper[4756]: I0318 
14:23:30.547719 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f28d65f-de6b-4458-a802-960f138e1296","Type":"ContainerStarted","Data":"29a072a87a0913d56bda2bdbb83e997ff294bc1ab44d47c0937748f54faf210e"} Mar 18 14:23:32 crc kubenswrapper[4756]: I0318 14:23:32.568928 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f28d65f-de6b-4458-a802-960f138e1296","Type":"ContainerStarted","Data":"9c5fdb80bdcbeb2b9515aecc045a967ec7883c2dcab70a02f7cf443903a05c96"} Mar 18 14:23:33 crc kubenswrapper[4756]: I0318 14:23:33.580948 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f28d65f-de6b-4458-a802-960f138e1296","Type":"ContainerStarted","Data":"dda648dd6b6883e8702d7bdb38df17ec51f4a405838cb186b095bcc0ea77974f"} Mar 18 14:23:35 crc kubenswrapper[4756]: I0318 14:23:35.599210 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f28d65f-de6b-4458-a802-960f138e1296","Type":"ContainerStarted","Data":"5f104ccee538bf2ff6ff371cebc8a8a30b6075ccd843ca8008949e17e046188b"} Mar 18 14:23:35 crc kubenswrapper[4756]: I0318 14:23:35.599850 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 14:23:35 crc kubenswrapper[4756]: I0318 14:23:35.628898 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.496062388 podStartE2EDuration="7.628878874s" podCreationTimestamp="2026-03-18 14:23:28 +0000 UTC" firstStartedPulling="2026-03-18 14:23:29.732488011 +0000 UTC m=+1411.046905986" lastFinishedPulling="2026-03-18 14:23:34.865304497 +0000 UTC m=+1416.179722472" observedRunningTime="2026-03-18 14:23:35.619777587 +0000 UTC m=+1416.934195562" watchObservedRunningTime="2026-03-18 14:23:35.628878874 +0000 UTC m=+1416.943296849" Mar 18 14:23:39 crc kubenswrapper[4756]: I0318 14:23:39.633865 4756 
generic.go:334] "Generic (PLEG): container finished" podID="73e64ca2-cc29-4a74-a970-b127ec0380f1" containerID="3d358fd515b011708016827117bac120bc582a21947c91d875a22ff1b818a612" exitCode=0 Mar 18 14:23:39 crc kubenswrapper[4756]: I0318 14:23:39.633962 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k4zq4" event={"ID":"73e64ca2-cc29-4a74-a970-b127ec0380f1","Type":"ContainerDied","Data":"3d358fd515b011708016827117bac120bc582a21947c91d875a22ff1b818a612"} Mar 18 14:23:40 crc kubenswrapper[4756]: I0318 14:23:40.219691 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:40 crc kubenswrapper[4756]: I0318 14:23:40.220265 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="ceilometer-central-agent" containerID="cri-o://bf099ea40a2a7c1c719f24ec8c26b35a5dbba2abbec7685301ac851230201ba0" gracePeriod=30 Mar 18 14:23:40 crc kubenswrapper[4756]: I0318 14:23:40.220393 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="proxy-httpd" containerID="cri-o://5f104ccee538bf2ff6ff371cebc8a8a30b6075ccd843ca8008949e17e046188b" gracePeriod=30 Mar 18 14:23:40 crc kubenswrapper[4756]: I0318 14:23:40.220345 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="ceilometer-notification-agent" containerID="cri-o://9c5fdb80bdcbeb2b9515aecc045a967ec7883c2dcab70a02f7cf443903a05c96" gracePeriod=30 Mar 18 14:23:40 crc kubenswrapper[4756]: I0318 14:23:40.220664 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="sg-core" 
containerID="cri-o://dda648dd6b6883e8702d7bdb38df17ec51f4a405838cb186b095bcc0ea77974f" gracePeriod=30 Mar 18 14:23:40 crc kubenswrapper[4756]: I0318 14:23:40.655412 4756 generic.go:334] "Generic (PLEG): container finished" podID="6f28d65f-de6b-4458-a802-960f138e1296" containerID="5f104ccee538bf2ff6ff371cebc8a8a30b6075ccd843ca8008949e17e046188b" exitCode=0 Mar 18 14:23:40 crc kubenswrapper[4756]: I0318 14:23:40.655703 4756 generic.go:334] "Generic (PLEG): container finished" podID="6f28d65f-de6b-4458-a802-960f138e1296" containerID="dda648dd6b6883e8702d7bdb38df17ec51f4a405838cb186b095bcc0ea77974f" exitCode=2 Mar 18 14:23:40 crc kubenswrapper[4756]: I0318 14:23:40.655716 4756 generic.go:334] "Generic (PLEG): container finished" podID="6f28d65f-de6b-4458-a802-960f138e1296" containerID="9c5fdb80bdcbeb2b9515aecc045a967ec7883c2dcab70a02f7cf443903a05c96" exitCode=0 Mar 18 14:23:40 crc kubenswrapper[4756]: I0318 14:23:40.655480 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f28d65f-de6b-4458-a802-960f138e1296","Type":"ContainerDied","Data":"5f104ccee538bf2ff6ff371cebc8a8a30b6075ccd843ca8008949e17e046188b"} Mar 18 14:23:40 crc kubenswrapper[4756]: I0318 14:23:40.655801 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f28d65f-de6b-4458-a802-960f138e1296","Type":"ContainerDied","Data":"dda648dd6b6883e8702d7bdb38df17ec51f4a405838cb186b095bcc0ea77974f"} Mar 18 14:23:40 crc kubenswrapper[4756]: I0318 14:23:40.655811 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f28d65f-de6b-4458-a802-960f138e1296","Type":"ContainerDied","Data":"9c5fdb80bdcbeb2b9515aecc045a967ec7883c2dcab70a02f7cf443903a05c96"} Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.072244 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.163325 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlwhn\" (UniqueName: \"kubernetes.io/projected/73e64ca2-cc29-4a74-a970-b127ec0380f1-kube-api-access-qlwhn\") pod \"73e64ca2-cc29-4a74-a970-b127ec0380f1\" (UID: \"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.163453 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-scripts\") pod \"73e64ca2-cc29-4a74-a970-b127ec0380f1\" (UID: \"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.163520 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-config-data\") pod \"73e64ca2-cc29-4a74-a970-b127ec0380f1\" (UID: \"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.163548 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-combined-ca-bundle\") pod \"73e64ca2-cc29-4a74-a970-b127ec0380f1\" (UID: \"73e64ca2-cc29-4a74-a970-b127ec0380f1\") " Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.168910 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73e64ca2-cc29-4a74-a970-b127ec0380f1-kube-api-access-qlwhn" (OuterVolumeSpecName: "kube-api-access-qlwhn") pod "73e64ca2-cc29-4a74-a970-b127ec0380f1" (UID: "73e64ca2-cc29-4a74-a970-b127ec0380f1"). InnerVolumeSpecName "kube-api-access-qlwhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.169464 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-scripts" (OuterVolumeSpecName: "scripts") pod "73e64ca2-cc29-4a74-a970-b127ec0380f1" (UID: "73e64ca2-cc29-4a74-a970-b127ec0380f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.193793 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-config-data" (OuterVolumeSpecName: "config-data") pod "73e64ca2-cc29-4a74-a970-b127ec0380f1" (UID: "73e64ca2-cc29-4a74-a970-b127ec0380f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.193871 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73e64ca2-cc29-4a74-a970-b127ec0380f1" (UID: "73e64ca2-cc29-4a74-a970-b127ec0380f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.266491 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.266536 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.266550 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e64ca2-cc29-4a74-a970-b127ec0380f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.266566 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlwhn\" (UniqueName: \"kubernetes.io/projected/73e64ca2-cc29-4a74-a970-b127ec0380f1-kube-api-access-qlwhn\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.667004 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k4zq4" event={"ID":"73e64ca2-cc29-4a74-a970-b127ec0380f1","Type":"ContainerDied","Data":"98f0fba6f39b3644919a984424daeffd603203588b23500a0b97310782e431f8"} Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.667052 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98f0fba6f39b3644919a984424daeffd603203588b23500a0b97310782e431f8" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.667129 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k4zq4" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.673954 4756 generic.go:334] "Generic (PLEG): container finished" podID="6f28d65f-de6b-4458-a802-960f138e1296" containerID="bf099ea40a2a7c1c719f24ec8c26b35a5dbba2abbec7685301ac851230201ba0" exitCode=0 Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.673995 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f28d65f-de6b-4458-a802-960f138e1296","Type":"ContainerDied","Data":"bf099ea40a2a7c1c719f24ec8c26b35a5dbba2abbec7685301ac851230201ba0"} Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.741270 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 14:23:41 crc kubenswrapper[4756]: E0318 14:23:41.742425 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e64ca2-cc29-4a74-a970-b127ec0380f1" containerName="nova-cell0-conductor-db-sync" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.747425 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e64ca2-cc29-4a74-a970-b127ec0380f1" containerName="nova-cell0-conductor-db-sync" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.747878 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="73e64ca2-cc29-4a74-a970-b127ec0380f1" containerName="nova-cell0-conductor-db-sync" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.748759 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.752398 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zxp4v" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.752723 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.762867 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.876593 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwppg\" (UniqueName: \"kubernetes.io/projected/eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3-kube-api-access-jwppg\") pod \"nova-cell0-conductor-0\" (UID: \"eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.876952 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.876978 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.978485 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.978541 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.978655 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwppg\" (UniqueName: \"kubernetes.io/projected/eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3-kube-api-access-jwppg\") pod \"nova-cell0-conductor-0\" (UID: \"eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.982666 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.984675 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 14:23:41 crc kubenswrapper[4756]: I0318 14:23:41.998567 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwppg\" (UniqueName: \"kubernetes.io/projected/eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3-kube-api-access-jwppg\") pod \"nova-cell0-conductor-0\" 
(UID: \"eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.071552 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.152781 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.287049 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-config-data\") pod \"6f28d65f-de6b-4458-a802-960f138e1296\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.287821 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-combined-ca-bundle\") pod \"6f28d65f-de6b-4458-a802-960f138e1296\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.287920 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-scripts\") pod \"6f28d65f-de6b-4458-a802-960f138e1296\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.288055 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f28d65f-de6b-4458-a802-960f138e1296-log-httpd\") pod \"6f28d65f-de6b-4458-a802-960f138e1296\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.288253 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-sg-core-conf-yaml\") pod \"6f28d65f-de6b-4458-a802-960f138e1296\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.288472 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8kwk\" (UniqueName: \"kubernetes.io/projected/6f28d65f-de6b-4458-a802-960f138e1296-kube-api-access-n8kwk\") pod \"6f28d65f-de6b-4458-a802-960f138e1296\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.288586 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f28d65f-de6b-4458-a802-960f138e1296-run-httpd\") pod \"6f28d65f-de6b-4458-a802-960f138e1296\" (UID: \"6f28d65f-de6b-4458-a802-960f138e1296\") " Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.300416 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f28d65f-de6b-4458-a802-960f138e1296-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6f28d65f-de6b-4458-a802-960f138e1296" (UID: "6f28d65f-de6b-4458-a802-960f138e1296"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.300957 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f28d65f-de6b-4458-a802-960f138e1296-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6f28d65f-de6b-4458-a802-960f138e1296" (UID: "6f28d65f-de6b-4458-a802-960f138e1296"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.368966 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f28d65f-de6b-4458-a802-960f138e1296-kube-api-access-n8kwk" (OuterVolumeSpecName: "kube-api-access-n8kwk") pod "6f28d65f-de6b-4458-a802-960f138e1296" (UID: "6f28d65f-de6b-4458-a802-960f138e1296"). InnerVolumeSpecName "kube-api-access-n8kwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.369517 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-scripts" (OuterVolumeSpecName: "scripts") pod "6f28d65f-de6b-4458-a802-960f138e1296" (UID: "6f28d65f-de6b-4458-a802-960f138e1296"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.451861 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6f28d65f-de6b-4458-a802-960f138e1296" (UID: "6f28d65f-de6b-4458-a802-960f138e1296"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.454469 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8kwk\" (UniqueName: \"kubernetes.io/projected/6f28d65f-de6b-4458-a802-960f138e1296-kube-api-access-n8kwk\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.454499 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f28d65f-de6b-4458-a802-960f138e1296-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.454508 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.454517 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f28d65f-de6b-4458-a802-960f138e1296-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.454525 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.739269 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f28d65f-de6b-4458-a802-960f138e1296" (UID: "6f28d65f-de6b-4458-a802-960f138e1296"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.761496 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.762884 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f28d65f-de6b-4458-a802-960f138e1296","Type":"ContainerDied","Data":"29a072a87a0913d56bda2bdbb83e997ff294bc1ab44d47c0937748f54faf210e"} Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.762932 4756 scope.go:117] "RemoveContainer" containerID="5f104ccee538bf2ff6ff371cebc8a8a30b6075ccd843ca8008949e17e046188b" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.763084 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.849322 4756 scope.go:117] "RemoveContainer" containerID="dda648dd6b6883e8702d7bdb38df17ec51f4a405838cb186b095bcc0ea77974f" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.897541 4756 scope.go:117] "RemoveContainer" containerID="9c5fdb80bdcbeb2b9515aecc045a967ec7883c2dcab70a02f7cf443903a05c96" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.938706 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-config-data" (OuterVolumeSpecName: "config-data") pod "6f28d65f-de6b-4458-a802-960f138e1296" (UID: "6f28d65f-de6b-4458-a802-960f138e1296"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.965597 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f28d65f-de6b-4458-a802-960f138e1296-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:23:42 crc kubenswrapper[4756]: I0318 14:23:42.970511 4756 scope.go:117] "RemoveContainer" containerID="bf099ea40a2a7c1c719f24ec8c26b35a5dbba2abbec7685301ac851230201ba0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.023612 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.128284 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.149074 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.196934 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:43 crc kubenswrapper[4756]: E0318 14:23:43.197627 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="ceilometer-notification-agent" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.197737 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="ceilometer-notification-agent" Mar 18 14:23:43 crc kubenswrapper[4756]: E0318 14:23:43.197825 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="proxy-httpd" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.197906 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="proxy-httpd" Mar 18 14:23:43 crc kubenswrapper[4756]: E0318 14:23:43.197985 4756 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="sg-core" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.198054 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="sg-core" Mar 18 14:23:43 crc kubenswrapper[4756]: E0318 14:23:43.198159 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="ceilometer-central-agent" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.198236 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="ceilometer-central-agent" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.198527 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="sg-core" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.198763 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="proxy-httpd" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.198850 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="ceilometer-central-agent" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.198943 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f28d65f-de6b-4458-a802-960f138e1296" containerName="ceilometer-notification-agent" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.209379 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.213250 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.219736 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.242320 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.324905 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f28d65f-de6b-4458-a802-960f138e1296" path="/var/lib/kubelet/pods/6f28d65f-de6b-4458-a802-960f138e1296/volumes" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.372459 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-config-data\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.372739 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vpnd\" (UniqueName: \"kubernetes.io/projected/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-kube-api-access-4vpnd\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.372866 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-scripts\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.373005 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.373250 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-run-httpd\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.373904 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-log-httpd\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.374058 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.475990 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.476378 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-config-data\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.476514 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vpnd\" (UniqueName: \"kubernetes.io/projected/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-kube-api-access-4vpnd\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.476643 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-scripts\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.476784 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.476888 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-run-httpd\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.477008 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-log-httpd\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 
14:23:43.477521 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-log-httpd\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.478539 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-run-httpd\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.481848 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.482682 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-scripts\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.482972 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-config-data\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.484883 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " 
pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.501024 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vpnd\" (UniqueName: \"kubernetes.io/projected/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-kube-api-access-4vpnd\") pod \"ceilometer-0\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") " pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.553850 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.804250 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3","Type":"ContainerStarted","Data":"08029769a9e53b229d72024a7bd93db529a09d1cb66fc3b372999f989d749c86"} Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.804292 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3","Type":"ContainerStarted","Data":"6c2f596b775eac2177cf321de54c49c6f7138aa1a05ee380a3b78055a4eadac4"} Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.805276 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 14:23:43 crc kubenswrapper[4756]: I0318 14:23:43.823621 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.823603162 podStartE2EDuration="2.823603162s" podCreationTimestamp="2026-03-18 14:23:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:23:43.823359155 +0000 UTC m=+1425.137777130" watchObservedRunningTime="2026-03-18 14:23:43.823603162 +0000 UTC m=+1425.138021137" Mar 18 14:23:44 crc kubenswrapper[4756]: I0318 14:23:44.014972 4756 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:23:44 crc kubenswrapper[4756]: I0318 14:23:44.813552 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2296fb-bd22-4c18-9ecb-7c6ad419b336","Type":"ContainerStarted","Data":"3bc13efaf539ce4c0c688e4a291261c3e82f4f0ba635d8228a20bf872c546483"} Mar 18 14:23:44 crc kubenswrapper[4756]: I0318 14:23:44.813807 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2296fb-bd22-4c18-9ecb-7c6ad419b336","Type":"ContainerStarted","Data":"9f339225384eb4e80c7aa8e8e45597ddc9748a273e29aab180b69de3775c6c5b"} Mar 18 14:23:45 crc kubenswrapper[4756]: I0318 14:23:45.858592 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2296fb-bd22-4c18-9ecb-7c6ad419b336","Type":"ContainerStarted","Data":"bcb153a107e360625e52455bf535271b34b8025a3f6bdfd4728ed1247eea38dd"} Mar 18 14:23:46 crc kubenswrapper[4756]: I0318 14:23:46.871208 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2296fb-bd22-4c18-9ecb-7c6ad419b336","Type":"ContainerStarted","Data":"0b042a4e40d2d56323798913402c426b205d14c767658556b71b2514944bb5bf"} Mar 18 14:23:49 crc kubenswrapper[4756]: I0318 14:23:49.897156 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2296fb-bd22-4c18-9ecb-7c6ad419b336","Type":"ContainerStarted","Data":"ce1864ce85c999a6a2848ddb0d2b55c29fd72f911b56e5e1717f60858e300f42"} Mar 18 14:23:49 crc kubenswrapper[4756]: I0318 14:23:49.897702 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 14:23:49 crc kubenswrapper[4756]: I0318 14:23:49.919178 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.705667942 podStartE2EDuration="6.91916158s" podCreationTimestamp="2026-03-18 14:23:43 +0000 
UTC" firstStartedPulling="2026-03-18 14:23:44.016027935 +0000 UTC m=+1425.330445910" lastFinishedPulling="2026-03-18 14:23:49.229521573 +0000 UTC m=+1430.543939548" observedRunningTime="2026-03-18 14:23:49.916695443 +0000 UTC m=+1431.231113418" watchObservedRunningTime="2026-03-18 14:23:49.91916158 +0000 UTC m=+1431.233579555" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.118024 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.562936 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-t85xw"] Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.564277 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.566131 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.566394 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.577348 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-t85xw"] Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.747395 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.748900 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.751262 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.760504 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-config-data\") pod \"nova-cell0-cell-mapping-t85xw\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") " pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.760553 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2m5v\" (UniqueName: \"kubernetes.io/projected/61189734-2130-4de6-aa34-c96879ff64de-kube-api-access-q2m5v\") pod \"nova-cell0-cell-mapping-t85xw\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") " pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.760707 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-t85xw\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") " pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.760759 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-scripts\") pod \"nova-cell0-cell-mapping-t85xw\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") " pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.774029 4756 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.778446 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.785456 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.806370 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.825357 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.826575 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.829210 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.855804 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.863369 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-scripts\") pod \"nova-cell0-cell-mapping-t85xw\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") " pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.863412 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-config-data\") pod \"nova-cell0-cell-mapping-t85xw\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") " pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 
14:23:52.863440 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2m5v\" (UniqueName: \"kubernetes.io/projected/61189734-2130-4de6-aa34-c96879ff64de-kube-api-access-q2m5v\") pod \"nova-cell0-cell-mapping-t85xw\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") " pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.863525 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb14feaf-f9db-4eee-aa10-a2da68219dfc-config-data\") pod \"nova-api-0\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") " pod="openstack/nova-api-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.863594 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb14feaf-f9db-4eee-aa10-a2da68219dfc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") " pod="openstack/nova-api-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.863616 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69rwd\" (UniqueName: \"kubernetes.io/projected/cb14feaf-f9db-4eee-aa10-a2da68219dfc-kube-api-access-69rwd\") pod \"nova-api-0\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") " pod="openstack/nova-api-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.863668 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb14feaf-f9db-4eee-aa10-a2da68219dfc-logs\") pod \"nova-api-0\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") " pod="openstack/nova-api-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.863687 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-t85xw\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") " pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.876762 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-scripts\") pod \"nova-cell0-cell-mapping-t85xw\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") " pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.882558 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-t85xw\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") " pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.886372 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-config-data\") pod \"nova-cell0-cell-mapping-t85xw\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") " pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.895472 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2m5v\" (UniqueName: \"kubernetes.io/projected/61189734-2130-4de6-aa34-c96879ff64de-kube-api-access-q2m5v\") pod \"nova-cell0-cell-mapping-t85xw\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") " pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.908049 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.961089 4756 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.962848 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.964873 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.968284 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df05fbc-2aca-4054-9b21-b0bfe7436077-config-data\") pod \"nova-scheduler-0\" (UID: \"9df05fbc-2aca-4054-9b21-b0bfe7436077\") " pod="openstack/nova-scheduler-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.968363 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.968426 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb14feaf-f9db-4eee-aa10-a2da68219dfc-logs\") pod \"nova-api-0\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") " pod="openstack/nova-api-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.968471 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nbv9\" (UniqueName: \"kubernetes.io/projected/9df05fbc-2aca-4054-9b21-b0bfe7436077-kube-api-access-9nbv9\") pod \"nova-scheduler-0\" (UID: \"9df05fbc-2aca-4054-9b21-b0bfe7436077\") " pod="openstack/nova-scheduler-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.968523 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq9tr\" (UniqueName: \"kubernetes.io/projected/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-kube-api-access-zq9tr\") pod \"nova-cell1-novncproxy-0\" (UID: \"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.968594 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df05fbc-2aca-4054-9b21-b0bfe7436077-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9df05fbc-2aca-4054-9b21-b0bfe7436077\") " pod="openstack/nova-scheduler-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.968669 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb14feaf-f9db-4eee-aa10-a2da68219dfc-config-data\") pod \"nova-api-0\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") " pod="openstack/nova-api-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.968732 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.968759 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb14feaf-f9db-4eee-aa10-a2da68219dfc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") " pod="openstack/nova-api-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.968785 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-69rwd\" (UniqueName: \"kubernetes.io/projected/cb14feaf-f9db-4eee-aa10-a2da68219dfc-kube-api-access-69rwd\") pod \"nova-api-0\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") " pod="openstack/nova-api-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.969344 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb14feaf-f9db-4eee-aa10-a2da68219dfc-logs\") pod \"nova-api-0\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") " pod="openstack/nova-api-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.995967 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb14feaf-f9db-4eee-aa10-a2da68219dfc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") " pod="openstack/nova-api-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.996936 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb14feaf-f9db-4eee-aa10-a2da68219dfc-config-data\") pod \"nova-api-0\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") " pod="openstack/nova-api-0" Mar 18 14:23:52 crc kubenswrapper[4756]: I0318 14:23:52.996971 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.004611 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69rwd\" (UniqueName: \"kubernetes.io/projected/cb14feaf-f9db-4eee-aa10-a2da68219dfc-kube-api-access-69rwd\") pod \"nova-api-0\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") " pod="openstack/nova-api-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.019187 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cd565959-ns4cg"] Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.020939 4756 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.045682 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-ns4cg"] Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.078085 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.119199 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-dns-svc\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.119257 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-config\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.119300 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klzp5\" (UniqueName: \"kubernetes.io/projected/95f072df-7757-4f69-a289-b75f17b122ad-kube-api-access-klzp5\") pod \"nova-metadata-0\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " pod="openstack/nova-metadata-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.122209 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 
14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.122265 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95f072df-7757-4f69-a289-b75f17b122ad-config-data\") pod \"nova-metadata-0\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " pod="openstack/nova-metadata-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.122321 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f072df-7757-4f69-a289-b75f17b122ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " pod="openstack/nova-metadata-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.122363 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df05fbc-2aca-4054-9b21-b0bfe7436077-config-data\") pod \"nova-scheduler-0\" (UID: \"9df05fbc-2aca-4054-9b21-b0bfe7436077\") " pod="openstack/nova-scheduler-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.122394 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.122443 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.122483 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.122529 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nbv9\" (UniqueName: \"kubernetes.io/projected/9df05fbc-2aca-4054-9b21-b0bfe7436077-kube-api-access-9nbv9\") pod \"nova-scheduler-0\" (UID: \"9df05fbc-2aca-4054-9b21-b0bfe7436077\") " pod="openstack/nova-scheduler-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.122577 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq9tr\" (UniqueName: \"kubernetes.io/projected/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-kube-api-access-zq9tr\") pod \"nova-cell1-novncproxy-0\" (UID: \"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.122609 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvczs\" (UniqueName: \"kubernetes.io/projected/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-kube-api-access-rvczs\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.122628 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95f072df-7757-4f69-a289-b75f17b122ad-logs\") pod \"nova-metadata-0\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " pod="openstack/nova-metadata-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.122688 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.122716 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df05fbc-2aca-4054-9b21-b0bfe7436077-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9df05fbc-2aca-4054-9b21-b0bfe7436077\") " pod="openstack/nova-scheduler-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.128746 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.131653 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df05fbc-2aca-4054-9b21-b0bfe7436077-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9df05fbc-2aca-4054-9b21-b0bfe7436077\") " pod="openstack/nova-scheduler-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.150342 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df05fbc-2aca-4054-9b21-b0bfe7436077-config-data\") pod \"nova-scheduler-0\" (UID: \"9df05fbc-2aca-4054-9b21-b0bfe7436077\") " pod="openstack/nova-scheduler-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.153956 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq9tr\" (UniqueName: 
\"kubernetes.io/projected/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-kube-api-access-zq9tr\") pod \"nova-cell1-novncproxy-0\" (UID: \"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.154568 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.175110 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nbv9\" (UniqueName: \"kubernetes.io/projected/9df05fbc-2aca-4054-9b21-b0bfe7436077-kube-api-access-9nbv9\") pod \"nova-scheduler-0\" (UID: \"9df05fbc-2aca-4054-9b21-b0bfe7436077\") " pod="openstack/nova-scheduler-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.181603 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.225065 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-dns-svc\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.225192 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-config\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.225264 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klzp5\" (UniqueName: \"kubernetes.io/projected/95f072df-7757-4f69-a289-b75f17b122ad-kube-api-access-klzp5\") pod \"nova-metadata-0\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " pod="openstack/nova-metadata-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.225294 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95f072df-7757-4f69-a289-b75f17b122ad-config-data\") pod \"nova-metadata-0\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " pod="openstack/nova-metadata-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.225431 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f072df-7757-4f69-a289-b75f17b122ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " pod="openstack/nova-metadata-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.225484 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.225510 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.225551 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvczs\" (UniqueName: \"kubernetes.io/projected/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-kube-api-access-rvczs\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.225567 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95f072df-7757-4f69-a289-b75f17b122ad-logs\") pod \"nova-metadata-0\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " pod="openstack/nova-metadata-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.225601 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.226291 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-config\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.226771 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.226979 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-dns-svc\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.227893 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.228157 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95f072df-7757-4f69-a289-b75f17b122ad-logs\") pod \"nova-metadata-0\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " pod="openstack/nova-metadata-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.228350 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: 
\"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.243696 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvczs\" (UniqueName: \"kubernetes.io/projected/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-kube-api-access-rvczs\") pod \"dnsmasq-dns-78cd565959-ns4cg\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.243981 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95f072df-7757-4f69-a289-b75f17b122ad-config-data\") pod \"nova-metadata-0\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " pod="openstack/nova-metadata-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.244246 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klzp5\" (UniqueName: \"kubernetes.io/projected/95f072df-7757-4f69-a289-b75f17b122ad-kube-api-access-klzp5\") pod \"nova-metadata-0\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " pod="openstack/nova-metadata-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.246140 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f072df-7757-4f69-a289-b75f17b122ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " pod="openstack/nova-metadata-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.392433 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.406771 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.421473 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.457312 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.899159 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b64xk"] Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.904974 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.923680 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.923792 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.953736 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b64xk"] Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.955687 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-b64xk\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.955800 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-config-data\") pod 
\"nova-cell1-conductor-db-sync-b64xk\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.955834 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-scripts\") pod \"nova-cell1-conductor-db-sync-b64xk\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.955883 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxzlr\" (UniqueName: \"kubernetes.io/projected/a8af5145-9647-4eb0-95f2-4e32d681fc9e-kube-api-access-mxzlr\") pod \"nova-cell1-conductor-db-sync-b64xk\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:23:53 crc kubenswrapper[4756]: I0318 14:23:53.971219 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 14:23:54 crc kubenswrapper[4756]: I0318 14:23:54.043918 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-t85xw"] Mar 18 14:23:54 crc kubenswrapper[4756]: W0318 14:23:54.063905 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61189734_2130_4de6_aa34_c96879ff64de.slice/crio-f9beba57b30f5c4a841d76c15b368010d38a8270a4756c3ac1e18dfc387dfe3a WatchSource:0}: Error finding container f9beba57b30f5c4a841d76c15b368010d38a8270a4756c3ac1e18dfc387dfe3a: Status 404 returned error can't find the container with id f9beba57b30f5c4a841d76c15b368010d38a8270a4756c3ac1e18dfc387dfe3a Mar 18 14:23:54 crc kubenswrapper[4756]: I0318 14:23:54.066774 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxzlr\" 
(UniqueName: \"kubernetes.io/projected/a8af5145-9647-4eb0-95f2-4e32d681fc9e-kube-api-access-mxzlr\") pod \"nova-cell1-conductor-db-sync-b64xk\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:23:54 crc kubenswrapper[4756]: I0318 14:23:54.066963 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-b64xk\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:23:54 crc kubenswrapper[4756]: I0318 14:23:54.067071 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-config-data\") pod \"nova-cell1-conductor-db-sync-b64xk\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:23:54 crc kubenswrapper[4756]: I0318 14:23:54.067101 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-scripts\") pod \"nova-cell1-conductor-db-sync-b64xk\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:23:54 crc kubenswrapper[4756]: I0318 14:23:54.076712 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-config-data\") pod \"nova-cell1-conductor-db-sync-b64xk\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:23:54 crc kubenswrapper[4756]: I0318 14:23:54.086671 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-b64xk\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:23:54 crc kubenswrapper[4756]: I0318 14:23:54.087348 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-scripts\") pod \"nova-cell1-conductor-db-sync-b64xk\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:23:54 crc kubenswrapper[4756]: I0318 14:23:54.099612 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxzlr\" (UniqueName: \"kubernetes.io/projected/a8af5145-9647-4eb0-95f2-4e32d681fc9e-kube-api-access-mxzlr\") pod \"nova-cell1-conductor-db-sync-b64xk\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:23:54 crc kubenswrapper[4756]: I0318 14:23:54.311900 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:23:54 crc kubenswrapper[4756]: I0318 14:23:54.561523 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 14:23:54 crc kubenswrapper[4756]: I0318 14:23:54.611591 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-ns4cg"] Mar 18 14:23:54 crc kubenswrapper[4756]: I0318 14:23:54.985960 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-t85xw" event={"ID":"61189734-2130-4de6-aa34-c96879ff64de","Type":"ContainerStarted","Data":"3d7edb2d29f4b86e999900f63630402ea7f227c196f2b8799d8d520d08f29895"} Mar 18 14:23:54 crc kubenswrapper[4756]: I0318 14:23:54.986331 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-t85xw" event={"ID":"61189734-2130-4de6-aa34-c96879ff64de","Type":"ContainerStarted","Data":"f9beba57b30f5c4a841d76c15b368010d38a8270a4756c3ac1e18dfc387dfe3a"} Mar 18 14:23:54 crc kubenswrapper[4756]: I0318 14:23:54.993700 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95f072df-7757-4f69-a289-b75f17b122ad","Type":"ContainerStarted","Data":"dc3b1bc5ac592590dac1e325c3982d3831d5874e39861b94f0c1918f7636a309"} Mar 18 14:23:55 crc kubenswrapper[4756]: I0318 14:23:55.003237 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb14feaf-f9db-4eee-aa10-a2da68219dfc","Type":"ContainerStarted","Data":"2fd656fa1b10ca64430c30aa0f24c15c9f7ce4dd91333a0a940d49fad84185df"} Mar 18 14:23:55 crc kubenswrapper[4756]: I0318 14:23:55.030004 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-t85xw" podStartSLOduration=3.029983211 podStartE2EDuration="3.029983211s" podCreationTimestamp="2026-03-18 14:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:23:55.017241777 +0000 UTC m=+1436.331659762" watchObservedRunningTime="2026-03-18 14:23:55.029983211 +0000 UTC m=+1436.344401186" Mar 18 14:23:55 crc kubenswrapper[4756]: I0318 14:23:55.031421 4756 generic.go:334] "Generic (PLEG): container finished" podID="dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a" containerID="0566d276433c3b7f728bf701e3b74cae247edc93d6211bacbb0ec1ac3f66f28e" exitCode=0 Mar 18 14:23:55 crc kubenswrapper[4756]: I0318 14:23:55.031454 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-ns4cg" event={"ID":"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a","Type":"ContainerDied","Data":"0566d276433c3b7f728bf701e3b74cae247edc93d6211bacbb0ec1ac3f66f28e"} Mar 18 14:23:55 crc kubenswrapper[4756]: I0318 14:23:55.031478 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-ns4cg" event={"ID":"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a","Type":"ContainerStarted","Data":"271b33457d9b61f32faa1893f63b97cadcb8ab9fdf1eb59649145f1db533fdf9"} Mar 18 14:23:55 crc kubenswrapper[4756]: W0318 14:23:55.077645 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb97ce503_7791_4a3c_b6b4_2dbfaccc94a1.slice/crio-7fb95c7c1905a3539ce190068f7cd75f8fb7120054d6c0e02e6eab295464012c WatchSource:0}: Error finding container 7fb95c7c1905a3539ce190068f7cd75f8fb7120054d6c0e02e6eab295464012c: Status 404 returned error can't find the container with id 7fb95c7c1905a3539ce190068f7cd75f8fb7120054d6c0e02e6eab295464012c Mar 18 14:23:55 crc kubenswrapper[4756]: I0318 14:23:55.081966 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 14:23:55 crc kubenswrapper[4756]: I0318 14:23:55.097897 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 14:23:55 crc kubenswrapper[4756]: I0318 
14:23:55.312639 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b64xk"] Mar 18 14:23:56 crc kubenswrapper[4756]: I0318 14:23:56.050088 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-ns4cg" event={"ID":"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a","Type":"ContainerStarted","Data":"c6348806d3190a6708fd6c47a6e91096b9409a436f6d9a0896df2b66fad90a12"} Mar 18 14:23:56 crc kubenswrapper[4756]: I0318 14:23:56.052063 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:23:56 crc kubenswrapper[4756]: I0318 14:23:56.055181 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b64xk" event={"ID":"a8af5145-9647-4eb0-95f2-4e32d681fc9e","Type":"ContainerStarted","Data":"f1203b8dbbf22be7e38417c9cf29ba809735cba1c0199aaa108f2daa976be915"} Mar 18 14:23:56 crc kubenswrapper[4756]: I0318 14:23:56.055220 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b64xk" event={"ID":"a8af5145-9647-4eb0-95f2-4e32d681fc9e","Type":"ContainerStarted","Data":"742e76e16df40d40bc9030aad7316bb8a31dc6f2d9b3a76d39e584f04c875351"} Mar 18 14:23:56 crc kubenswrapper[4756]: I0318 14:23:56.067085 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9df05fbc-2aca-4054-9b21-b0bfe7436077","Type":"ContainerStarted","Data":"761d3781e56c9c3d63e661d829a9c638f19123b7a94fa5ec4976188362e6dbb7"} Mar 18 14:23:56 crc kubenswrapper[4756]: I0318 14:23:56.073761 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cd565959-ns4cg" podStartSLOduration=4.073745854 podStartE2EDuration="4.073745854s" podCreationTimestamp="2026-03-18 14:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
14:23:56.071233596 +0000 UTC m=+1437.385651591" watchObservedRunningTime="2026-03-18 14:23:56.073745854 +0000 UTC m=+1437.388163829" Mar 18 14:23:56 crc kubenswrapper[4756]: I0318 14:23:56.077291 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1","Type":"ContainerStarted","Data":"7fb95c7c1905a3539ce190068f7cd75f8fb7120054d6c0e02e6eab295464012c"} Mar 18 14:23:56 crc kubenswrapper[4756]: I0318 14:23:56.108397 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-b64xk" podStartSLOduration=3.10837306 podStartE2EDuration="3.10837306s" podCreationTimestamp="2026-03-18 14:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:23:56.089641034 +0000 UTC m=+1437.404059009" watchObservedRunningTime="2026-03-18 14:23:56.10837306 +0000 UTC m=+1437.422791035" Mar 18 14:23:56 crc kubenswrapper[4756]: I0318 14:23:56.542384 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 14:23:56 crc kubenswrapper[4756]: I0318 14:23:56.555103 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 14:23:59 crc kubenswrapper[4756]: I0318 14:23:59.108976 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1","Type":"ContainerStarted","Data":"920a6c13e95d8c98dca0033b3081f9edb413d45f4ac0725ae17af9b5665cff48"} Mar 18 14:23:59 crc kubenswrapper[4756]: I0318 14:23:59.109337 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b97ce503-7791-4a3c-b6b4-2dbfaccc94a1" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://920a6c13e95d8c98dca0033b3081f9edb413d45f4ac0725ae17af9b5665cff48" 
gracePeriod=30 Mar 18 14:23:59 crc kubenswrapper[4756]: I0318 14:23:59.111577 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95f072df-7757-4f69-a289-b75f17b122ad","Type":"ContainerStarted","Data":"139dd49e881b7cb92a14d47495e0b4995a8a93c1b3d1d83d36bd8f33055088a2"} Mar 18 14:23:59 crc kubenswrapper[4756]: I0318 14:23:59.111613 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95f072df-7757-4f69-a289-b75f17b122ad","Type":"ContainerStarted","Data":"5cc3858f0e33434acfab966cdbe709bf219600e383f27ba0bccf560cc6cc9592"} Mar 18 14:23:59 crc kubenswrapper[4756]: I0318 14:23:59.111727 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="95f072df-7757-4f69-a289-b75f17b122ad" containerName="nova-metadata-log" containerID="cri-o://5cc3858f0e33434acfab966cdbe709bf219600e383f27ba0bccf560cc6cc9592" gracePeriod=30 Mar 18 14:23:59 crc kubenswrapper[4756]: I0318 14:23:59.111805 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="95f072df-7757-4f69-a289-b75f17b122ad" containerName="nova-metadata-metadata" containerID="cri-o://139dd49e881b7cb92a14d47495e0b4995a8a93c1b3d1d83d36bd8f33055088a2" gracePeriod=30 Mar 18 14:23:59 crc kubenswrapper[4756]: I0318 14:23:59.114913 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb14feaf-f9db-4eee-aa10-a2da68219dfc","Type":"ContainerStarted","Data":"a5cfbab009eb2f3b7a8d269c82aafdc187e061cb0c0e608b421eca6640c3776a"} Mar 18 14:23:59 crc kubenswrapper[4756]: I0318 14:23:59.114942 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb14feaf-f9db-4eee-aa10-a2da68219dfc","Type":"ContainerStarted","Data":"efe3ba966e239c2c2dcdaccd3a1599c0b46d5397106a4588fc828960a6aa24b0"} Mar 18 14:23:59 crc kubenswrapper[4756]: I0318 14:23:59.119397 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9df05fbc-2aca-4054-9b21-b0bfe7436077","Type":"ContainerStarted","Data":"0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece"} Mar 18 14:23:59 crc kubenswrapper[4756]: I0318 14:23:59.139352 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.8626021440000002 podStartE2EDuration="7.139322504s" podCreationTimestamp="2026-03-18 14:23:52 +0000 UTC" firstStartedPulling="2026-03-18 14:23:55.088682958 +0000 UTC m=+1436.403100933" lastFinishedPulling="2026-03-18 14:23:58.365403318 +0000 UTC m=+1439.679821293" observedRunningTime="2026-03-18 14:23:59.125014257 +0000 UTC m=+1440.439432222" watchObservedRunningTime="2026-03-18 14:23:59.139322504 +0000 UTC m=+1440.453740479" Mar 18 14:23:59 crc kubenswrapper[4756]: I0318 14:23:59.175778 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.905058093 podStartE2EDuration="7.175755309s" podCreationTimestamp="2026-03-18 14:23:52 +0000 UTC" firstStartedPulling="2026-03-18 14:23:55.076246453 +0000 UTC m=+1436.390664428" lastFinishedPulling="2026-03-18 14:23:58.346943669 +0000 UTC m=+1439.661361644" observedRunningTime="2026-03-18 14:23:59.165683237 +0000 UTC m=+1440.480101212" watchObservedRunningTime="2026-03-18 14:23:59.175755309 +0000 UTC m=+1440.490173284" Mar 18 14:23:59 crc kubenswrapper[4756]: I0318 14:23:59.216545 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.843137 podStartE2EDuration="7.216523672s" podCreationTimestamp="2026-03-18 14:23:52 +0000 UTC" firstStartedPulling="2026-03-18 14:23:53.972275952 +0000 UTC m=+1435.286693927" lastFinishedPulling="2026-03-18 14:23:58.345662624 +0000 UTC m=+1439.660080599" observedRunningTime="2026-03-18 14:23:59.202152583 +0000 UTC m=+1440.516570558" 
watchObservedRunningTime="2026-03-18 14:23:59.216523672 +0000 UTC m=+1440.530941637" Mar 18 14:23:59 crc kubenswrapper[4756]: I0318 14:23:59.231275 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.449838574 podStartE2EDuration="7.23125484s" podCreationTimestamp="2026-03-18 14:23:52 +0000 UTC" firstStartedPulling="2026-03-18 14:23:54.566802618 +0000 UTC m=+1435.881220593" lastFinishedPulling="2026-03-18 14:23:58.348218884 +0000 UTC m=+1439.662636859" observedRunningTime="2026-03-18 14:23:59.229286186 +0000 UTC m=+1440.543704161" watchObservedRunningTime="2026-03-18 14:23:59.23125484 +0000 UTC m=+1440.545672815" Mar 18 14:24:00 crc kubenswrapper[4756]: I0318 14:24:00.128543 4756 generic.go:334] "Generic (PLEG): container finished" podID="95f072df-7757-4f69-a289-b75f17b122ad" containerID="5cc3858f0e33434acfab966cdbe709bf219600e383f27ba0bccf560cc6cc9592" exitCode=143 Mar 18 14:24:00 crc kubenswrapper[4756]: I0318 14:24:00.128733 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95f072df-7757-4f69-a289-b75f17b122ad","Type":"ContainerDied","Data":"5cc3858f0e33434acfab966cdbe709bf219600e383f27ba0bccf560cc6cc9592"} Mar 18 14:24:00 crc kubenswrapper[4756]: I0318 14:24:00.139533 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564064-wt6kw"] Mar 18 14:24:00 crc kubenswrapper[4756]: I0318 14:24:00.140913 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564064-wt6kw" Mar 18 14:24:00 crc kubenswrapper[4756]: I0318 14:24:00.147071 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:24:00 crc kubenswrapper[4756]: I0318 14:24:00.147261 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:24:00 crc kubenswrapper[4756]: I0318 14:24:00.147538 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:24:00 crc kubenswrapper[4756]: I0318 14:24:00.151150 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564064-wt6kw"] Mar 18 14:24:00 crc kubenswrapper[4756]: I0318 14:24:00.207255 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs8ft\" (UniqueName: \"kubernetes.io/projected/7958dad1-8aea-4918-be5f-9cfd91a229a9-kube-api-access-bs8ft\") pod \"auto-csr-approver-29564064-wt6kw\" (UID: \"7958dad1-8aea-4918-be5f-9cfd91a229a9\") " pod="openshift-infra/auto-csr-approver-29564064-wt6kw" Mar 18 14:24:00 crc kubenswrapper[4756]: I0318 14:24:00.309523 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs8ft\" (UniqueName: \"kubernetes.io/projected/7958dad1-8aea-4918-be5f-9cfd91a229a9-kube-api-access-bs8ft\") pod \"auto-csr-approver-29564064-wt6kw\" (UID: \"7958dad1-8aea-4918-be5f-9cfd91a229a9\") " pod="openshift-infra/auto-csr-approver-29564064-wt6kw" Mar 18 14:24:00 crc kubenswrapper[4756]: I0318 14:24:00.326900 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs8ft\" (UniqueName: \"kubernetes.io/projected/7958dad1-8aea-4918-be5f-9cfd91a229a9-kube-api-access-bs8ft\") pod \"auto-csr-approver-29564064-wt6kw\" (UID: \"7958dad1-8aea-4918-be5f-9cfd91a229a9\") " 
pod="openshift-infra/auto-csr-approver-29564064-wt6kw" Mar 18 14:24:00 crc kubenswrapper[4756]: I0318 14:24:00.457555 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564064-wt6kw" Mar 18 14:24:00 crc kubenswrapper[4756]: I0318 14:24:00.976156 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564064-wt6kw"] Mar 18 14:24:01 crc kubenswrapper[4756]: I0318 14:24:01.138650 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564064-wt6kw" event={"ID":"7958dad1-8aea-4918-be5f-9cfd91a229a9","Type":"ContainerStarted","Data":"c5ff04b4d473e389b7d521db5a6f306e781b76dc3317071e54a2bb1904db43c1"} Mar 18 14:24:03 crc kubenswrapper[4756]: I0318 14:24:03.082047 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 14:24:03 crc kubenswrapper[4756]: I0318 14:24:03.083416 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 14:24:03 crc kubenswrapper[4756]: I0318 14:24:03.159251 4756 generic.go:334] "Generic (PLEG): container finished" podID="7958dad1-8aea-4918-be5f-9cfd91a229a9" containerID="5727aa64c617c359d928b7d123101e3575db23ec6cea09c351ce25f4dee49f3e" exitCode=0 Mar 18 14:24:03 crc kubenswrapper[4756]: I0318 14:24:03.159353 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564064-wt6kw" event={"ID":"7958dad1-8aea-4918-be5f-9cfd91a229a9","Type":"ContainerDied","Data":"5727aa64c617c359d928b7d123101e3575db23ec6cea09c351ce25f4dee49f3e"} Mar 18 14:24:03 crc kubenswrapper[4756]: I0318 14:24:03.163323 4756 generic.go:334] "Generic (PLEG): container finished" podID="61189734-2130-4de6-aa34-c96879ff64de" containerID="3d7edb2d29f4b86e999900f63630402ea7f227c196f2b8799d8d520d08f29895" exitCode=0 Mar 18 14:24:03 crc kubenswrapper[4756]: I0318 14:24:03.163377 4756 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-t85xw" event={"ID":"61189734-2130-4de6-aa34-c96879ff64de","Type":"ContainerDied","Data":"3d7edb2d29f4b86e999900f63630402ea7f227c196f2b8799d8d520d08f29895"} Mar 18 14:24:03 crc kubenswrapper[4756]: I0318 14:24:03.409307 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:24:03 crc kubenswrapper[4756]: I0318 14:24:03.422470 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:03 crc kubenswrapper[4756]: I0318 14:24:03.458176 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 14:24:03 crc kubenswrapper[4756]: I0318 14:24:03.458759 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 14:24:03 crc kubenswrapper[4756]: I0318 14:24:03.495273 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-vbm9s"] Mar 18 14:24:03 crc kubenswrapper[4756]: I0318 14:24:03.495486 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" podUID="d9909000-8e99-44f7-9f35-570757a60e59" containerName="dnsmasq-dns" containerID="cri-o://5d9d11cf69fb28b9cd08dc8e149bc3293ca7150751413dc8440cd248ace5a13b" gracePeriod=10 Mar 18 14:24:03 crc kubenswrapper[4756]: I0318 14:24:03.523223 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.165351 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cb14feaf-f9db-4eee-aa10-a2da68219dfc" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.219:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:24:04 crc 
kubenswrapper[4756]: I0318 14:24:04.165380 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cb14feaf-f9db-4eee-aa10-a2da68219dfc" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.219:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.191493 4756 generic.go:334] "Generic (PLEG): container finished" podID="d9909000-8e99-44f7-9f35-570757a60e59" containerID="5d9d11cf69fb28b9cd08dc8e149bc3293ca7150751413dc8440cd248ace5a13b" exitCode=0 Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.191571 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" event={"ID":"d9909000-8e99-44f7-9f35-570757a60e59","Type":"ContainerDied","Data":"5d9d11cf69fb28b9cd08dc8e149bc3293ca7150751413dc8440cd248ace5a13b"} Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.193614 4756 generic.go:334] "Generic (PLEG): container finished" podID="a8af5145-9647-4eb0-95f2-4e32d681fc9e" containerID="f1203b8dbbf22be7e38417c9cf29ba809735cba1c0199aaa108f2daa976be915" exitCode=0 Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.193854 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b64xk" event={"ID":"a8af5145-9647-4eb0-95f2-4e32d681fc9e","Type":"ContainerDied","Data":"f1203b8dbbf22be7e38417c9cf29ba809735cba1c0199aaa108f2daa976be915"} Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.256708 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.337795 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.426379 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-ovsdbserver-sb\") pod \"d9909000-8e99-44f7-9f35-570757a60e59\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.426437 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-config\") pod \"d9909000-8e99-44f7-9f35-570757a60e59\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.426558 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sld7\" (UniqueName: \"kubernetes.io/projected/d9909000-8e99-44f7-9f35-570757a60e59-kube-api-access-5sld7\") pod \"d9909000-8e99-44f7-9f35-570757a60e59\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.426592 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-ovsdbserver-nb\") pod \"d9909000-8e99-44f7-9f35-570757a60e59\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.426655 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-dns-svc\") pod \"d9909000-8e99-44f7-9f35-570757a60e59\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.426714 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-dns-swift-storage-0\") pod \"d9909000-8e99-44f7-9f35-570757a60e59\" (UID: \"d9909000-8e99-44f7-9f35-570757a60e59\") " Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.432943 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9909000-8e99-44f7-9f35-570757a60e59-kube-api-access-5sld7" (OuterVolumeSpecName: "kube-api-access-5sld7") pod "d9909000-8e99-44f7-9f35-570757a60e59" (UID: "d9909000-8e99-44f7-9f35-570757a60e59"). InnerVolumeSpecName "kube-api-access-5sld7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.515805 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d9909000-8e99-44f7-9f35-570757a60e59" (UID: "d9909000-8e99-44f7-9f35-570757a60e59"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.528793 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d9909000-8e99-44f7-9f35-570757a60e59" (UID: "d9909000-8e99-44f7-9f35-570757a60e59"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.533617 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.533650 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.533662 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sld7\" (UniqueName: \"kubernetes.io/projected/d9909000-8e99-44f7-9f35-570757a60e59-kube-api-access-5sld7\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.543242 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-config" (OuterVolumeSpecName: "config") pod "d9909000-8e99-44f7-9f35-570757a60e59" (UID: "d9909000-8e99-44f7-9f35-570757a60e59"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.558658 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d9909000-8e99-44f7-9f35-570757a60e59" (UID: "d9909000-8e99-44f7-9f35-570757a60e59"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.577633 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d9909000-8e99-44f7-9f35-570757a60e59" (UID: "d9909000-8e99-44f7-9f35-570757a60e59"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.635456 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.635485 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:04 crc kubenswrapper[4756]: I0318 14:24:04.635496 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9909000-8e99-44f7-9f35-570757a60e59-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.086271 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564064-wt6kw" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.092836 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.204874 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-t85xw" event={"ID":"61189734-2130-4de6-aa34-c96879ff64de","Type":"ContainerDied","Data":"f9beba57b30f5c4a841d76c15b368010d38a8270a4756c3ac1e18dfc387dfe3a"} Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.204924 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9beba57b30f5c4a841d76c15b368010d38a8270a4756c3ac1e18dfc387dfe3a" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.204888 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-t85xw" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.206781 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564064-wt6kw" event={"ID":"7958dad1-8aea-4918-be5f-9cfd91a229a9","Type":"ContainerDied","Data":"c5ff04b4d473e389b7d521db5a6f306e781b76dc3317071e54a2bb1904db43c1"} Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.206804 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564064-wt6kw" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.206811 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5ff04b4d473e389b7d521db5a6f306e781b76dc3317071e54a2bb1904db43c1" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.208973 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.209048 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-vbm9s" event={"ID":"d9909000-8e99-44f7-9f35-570757a60e59","Type":"ContainerDied","Data":"efa440667c92c35928da2a2d751c3979a34d7a8afeea0cb819535dba3f80ac8c"} Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.209082 4756 scope.go:117] "RemoveContainer" containerID="5d9d11cf69fb28b9cd08dc8e149bc3293ca7150751413dc8440cd248ace5a13b" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.258235 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-config-data\") pod \"61189734-2130-4de6-aa34-c96879ff64de\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") " Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.258411 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-combined-ca-bundle\") pod \"61189734-2130-4de6-aa34-c96879ff64de\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") " Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.258436 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs8ft\" (UniqueName: \"kubernetes.io/projected/7958dad1-8aea-4918-be5f-9cfd91a229a9-kube-api-access-bs8ft\") pod \"7958dad1-8aea-4918-be5f-9cfd91a229a9\" (UID: \"7958dad1-8aea-4918-be5f-9cfd91a229a9\") " Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.258604 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2m5v\" (UniqueName: \"kubernetes.io/projected/61189734-2130-4de6-aa34-c96879ff64de-kube-api-access-q2m5v\") pod \"61189734-2130-4de6-aa34-c96879ff64de\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") 
" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.258641 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-scripts\") pod \"61189734-2130-4de6-aa34-c96879ff64de\" (UID: \"61189734-2130-4de6-aa34-c96879ff64de\") " Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.273341 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7958dad1-8aea-4918-be5f-9cfd91a229a9-kube-api-access-bs8ft" (OuterVolumeSpecName: "kube-api-access-bs8ft") pod "7958dad1-8aea-4918-be5f-9cfd91a229a9" (UID: "7958dad1-8aea-4918-be5f-9cfd91a229a9"). InnerVolumeSpecName "kube-api-access-bs8ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.273490 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61189734-2130-4de6-aa34-c96879ff64de-kube-api-access-q2m5v" (OuterVolumeSpecName: "kube-api-access-q2m5v") pod "61189734-2130-4de6-aa34-c96879ff64de" (UID: "61189734-2130-4de6-aa34-c96879ff64de"). InnerVolumeSpecName "kube-api-access-q2m5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.298471 4756 scope.go:117] "RemoveContainer" containerID="301dc2b8c0a915aca527b5629fe8d1ee84f9ca9a6fcb54a44f0a3c237a5cb2aa" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.298663 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-scripts" (OuterVolumeSpecName: "scripts") pod "61189734-2130-4de6-aa34-c96879ff64de" (UID: "61189734-2130-4de6-aa34-c96879ff64de"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.310428 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-config-data" (OuterVolumeSpecName: "config-data") pod "61189734-2130-4de6-aa34-c96879ff64de" (UID: "61189734-2130-4de6-aa34-c96879ff64de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.314470 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-vbm9s"] Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.329355 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61189734-2130-4de6-aa34-c96879ff64de" (UID: "61189734-2130-4de6-aa34-c96879ff64de"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.346113 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-vbm9s"] Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.374731 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2m5v\" (UniqueName: \"kubernetes.io/projected/61189734-2130-4de6-aa34-c96879ff64de-kube-api-access-q2m5v\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.374813 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.374874 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.374932 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61189734-2130-4de6-aa34-c96879ff64de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.375345 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs8ft\" (UniqueName: \"kubernetes.io/projected/7958dad1-8aea-4918-be5f-9cfd91a229a9-kube-api-access-bs8ft\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.384308 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.384525 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cb14feaf-f9db-4eee-aa10-a2da68219dfc" containerName="nova-api-log" 
containerID="cri-o://efe3ba966e239c2c2dcdaccd3a1599c0b46d5397106a4588fc828960a6aa24b0" gracePeriod=30 Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.385160 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cb14feaf-f9db-4eee-aa10-a2da68219dfc" containerName="nova-api-api" containerID="cri-o://a5cfbab009eb2f3b7a8d269c82aafdc187e061cb0c0e608b421eca6640c3776a" gracePeriod=30 Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.530377 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.877683 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.984886 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxzlr\" (UniqueName: \"kubernetes.io/projected/a8af5145-9647-4eb0-95f2-4e32d681fc9e-kube-api-access-mxzlr\") pod \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.984968 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-config-data\") pod \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.985039 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-scripts\") pod \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.985161 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-combined-ca-bundle\") pod \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\" (UID: \"a8af5145-9647-4eb0-95f2-4e32d681fc9e\") " Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.990196 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8af5145-9647-4eb0-95f2-4e32d681fc9e-kube-api-access-mxzlr" (OuterVolumeSpecName: "kube-api-access-mxzlr") pod "a8af5145-9647-4eb0-95f2-4e32d681fc9e" (UID: "a8af5145-9647-4eb0-95f2-4e32d681fc9e"). InnerVolumeSpecName "kube-api-access-mxzlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:24:05 crc kubenswrapper[4756]: I0318 14:24:05.992964 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-scripts" (OuterVolumeSpecName: "scripts") pod "a8af5145-9647-4eb0-95f2-4e32d681fc9e" (UID: "a8af5145-9647-4eb0-95f2-4e32d681fc9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.025497 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8af5145-9647-4eb0-95f2-4e32d681fc9e" (UID: "a8af5145-9647-4eb0-95f2-4e32d681fc9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.026672 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-config-data" (OuterVolumeSpecName: "config-data") pod "a8af5145-9647-4eb0-95f2-4e32d681fc9e" (UID: "a8af5145-9647-4eb0-95f2-4e32d681fc9e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.087735 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxzlr\" (UniqueName: \"kubernetes.io/projected/a8af5145-9647-4eb0-95f2-4e32d681fc9e-kube-api-access-mxzlr\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.087772 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.087782 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.087791 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8af5145-9647-4eb0-95f2-4e32d681fc9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.147365 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564058-qrjcw"] Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.156992 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564058-qrjcw"] Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.222704 4756 generic.go:334] "Generic (PLEG): container finished" podID="cb14feaf-f9db-4eee-aa10-a2da68219dfc" containerID="efe3ba966e239c2c2dcdaccd3a1599c0b46d5397106a4588fc828960a6aa24b0" exitCode=143 Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.222784 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"cb14feaf-f9db-4eee-aa10-a2da68219dfc","Type":"ContainerDied","Data":"efe3ba966e239c2c2dcdaccd3a1599c0b46d5397106a4588fc828960a6aa24b0"} Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.239506 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b64xk" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.239556 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b64xk" event={"ID":"a8af5145-9647-4eb0-95f2-4e32d681fc9e","Type":"ContainerDied","Data":"742e76e16df40d40bc9030aad7316bb8a31dc6f2d9b3a76d39e584f04c875351"} Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.239585 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="742e76e16df40d40bc9030aad7316bb8a31dc6f2d9b3a76d39e584f04c875351" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.239618 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9df05fbc-2aca-4054-9b21-b0bfe7436077" containerName="nova-scheduler-scheduler" containerID="cri-o://0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece" gracePeriod=30 Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.291161 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 14:24:06 crc kubenswrapper[4756]: E0318 14:24:06.291603 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9909000-8e99-44f7-9f35-570757a60e59" containerName="dnsmasq-dns" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.291620 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9909000-8e99-44f7-9f35-570757a60e59" containerName="dnsmasq-dns" Mar 18 14:24:06 crc kubenswrapper[4756]: E0318 14:24:06.291636 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61189734-2130-4de6-aa34-c96879ff64de" containerName="nova-manage" Mar 18 
14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.291641 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="61189734-2130-4de6-aa34-c96879ff64de" containerName="nova-manage" Mar 18 14:24:06 crc kubenswrapper[4756]: E0318 14:24:06.291668 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8af5145-9647-4eb0-95f2-4e32d681fc9e" containerName="nova-cell1-conductor-db-sync" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.291674 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8af5145-9647-4eb0-95f2-4e32d681fc9e" containerName="nova-cell1-conductor-db-sync" Mar 18 14:24:06 crc kubenswrapper[4756]: E0318 14:24:06.291687 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9909000-8e99-44f7-9f35-570757a60e59" containerName="init" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.291692 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9909000-8e99-44f7-9f35-570757a60e59" containerName="init" Mar 18 14:24:06 crc kubenswrapper[4756]: E0318 14:24:06.291708 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7958dad1-8aea-4918-be5f-9cfd91a229a9" containerName="oc" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.291714 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="7958dad1-8aea-4918-be5f-9cfd91a229a9" containerName="oc" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.291903 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9909000-8e99-44f7-9f35-570757a60e59" containerName="dnsmasq-dns" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.291915 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="61189734-2130-4de6-aa34-c96879ff64de" containerName="nova-manage" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.291937 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="7958dad1-8aea-4918-be5f-9cfd91a229a9" containerName="oc" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 
14:24:06.291944 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8af5145-9647-4eb0-95f2-4e32d681fc9e" containerName="nova-cell1-conductor-db-sync" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.292699 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.297961 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.318635 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.393747 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bf60e6c-1de0-4b66-84fe-635d0d235bad-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8bf60e6c-1de0-4b66-84fe-635d0d235bad\") " pod="openstack/nova-cell1-conductor-0" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.393795 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdfwl\" (UniqueName: \"kubernetes.io/projected/8bf60e6c-1de0-4b66-84fe-635d0d235bad-kube-api-access-fdfwl\") pod \"nova-cell1-conductor-0\" (UID: \"8bf60e6c-1de0-4b66-84fe-635d0d235bad\") " pod="openstack/nova-cell1-conductor-0" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.393851 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bf60e6c-1de0-4b66-84fe-635d0d235bad-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8bf60e6c-1de0-4b66-84fe-635d0d235bad\") " pod="openstack/nova-cell1-conductor-0" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.495760 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fdfwl\" (UniqueName: \"kubernetes.io/projected/8bf60e6c-1de0-4b66-84fe-635d0d235bad-kube-api-access-fdfwl\") pod \"nova-cell1-conductor-0\" (UID: \"8bf60e6c-1de0-4b66-84fe-635d0d235bad\") " pod="openstack/nova-cell1-conductor-0" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.495834 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bf60e6c-1de0-4b66-84fe-635d0d235bad-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8bf60e6c-1de0-4b66-84fe-635d0d235bad\") " pod="openstack/nova-cell1-conductor-0" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.495992 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bf60e6c-1de0-4b66-84fe-635d0d235bad-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8bf60e6c-1de0-4b66-84fe-635d0d235bad\") " pod="openstack/nova-cell1-conductor-0" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.500687 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bf60e6c-1de0-4b66-84fe-635d0d235bad-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8bf60e6c-1de0-4b66-84fe-635d0d235bad\") " pod="openstack/nova-cell1-conductor-0" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.501586 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bf60e6c-1de0-4b66-84fe-635d0d235bad-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8bf60e6c-1de0-4b66-84fe-635d0d235bad\") " pod="openstack/nova-cell1-conductor-0" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.512010 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdfwl\" (UniqueName: 
\"kubernetes.io/projected/8bf60e6c-1de0-4b66-84fe-635d0d235bad-kube-api-access-fdfwl\") pod \"nova-cell1-conductor-0\" (UID: \"8bf60e6c-1de0-4b66-84fe-635d0d235bad\") " pod="openstack/nova-cell1-conductor-0" Mar 18 14:24:06 crc kubenswrapper[4756]: I0318 14:24:06.612501 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 14:24:07 crc kubenswrapper[4756]: I0318 14:24:07.087221 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 14:24:07 crc kubenswrapper[4756]: I0318 14:24:07.251928 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8bf60e6c-1de0-4b66-84fe-635d0d235bad","Type":"ContainerStarted","Data":"8804e85e1d28ca79200c8af3efaaf207f28fd1d21bba5b0da33c16d0bf442d52"} Mar 18 14:24:07 crc kubenswrapper[4756]: I0318 14:24:07.329695 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a05ddf2-3ab0-4c2b-b697-8609cd540ffe" path="/var/lib/kubelet/pods/2a05ddf2-3ab0-4c2b-b697-8609cd540ffe/volumes" Mar 18 14:24:07 crc kubenswrapper[4756]: I0318 14:24:07.332314 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9909000-8e99-44f7-9f35-570757a60e59" path="/var/lib/kubelet/pods/d9909000-8e99-44f7-9f35-570757a60e59/volumes" Mar 18 14:24:08 crc kubenswrapper[4756]: I0318 14:24:08.261773 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8bf60e6c-1de0-4b66-84fe-635d0d235bad","Type":"ContainerStarted","Data":"5c0bd465d96412f81acc8921a5c4335dbfd1f65bf8fe347dac3933dafe957381"} Mar 18 14:24:08 crc kubenswrapper[4756]: I0318 14:24:08.262690 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 14:24:08 crc kubenswrapper[4756]: I0318 14:24:08.277510 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.277495661 podStartE2EDuration="2.277495661s" podCreationTimestamp="2026-03-18 14:24:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:24:08.276417533 +0000 UTC m=+1449.590835508" watchObservedRunningTime="2026-03-18 14:24:08.277495661 +0000 UTC m=+1449.591913636" Mar 18 14:24:08 crc kubenswrapper[4756]: E0318 14:24:08.460195 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 14:24:08 crc kubenswrapper[4756]: E0318 14:24:08.461688 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 14:24:08 crc kubenswrapper[4756]: E0318 14:24:08.462960 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 14:24:08 crc kubenswrapper[4756]: E0318 14:24:08.462997 4756 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="9df05fbc-2aca-4054-9b21-b0bfe7436077" containerName="nova-scheduler-scheduler" Mar 18 14:24:11 
crc kubenswrapper[4756]: I0318 14:24:11.082873 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.083477 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.149983 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.292079 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nbv9\" (UniqueName: \"kubernetes.io/projected/9df05fbc-2aca-4054-9b21-b0bfe7436077-kube-api-access-9nbv9\") pod \"9df05fbc-2aca-4054-9b21-b0bfe7436077\" (UID: \"9df05fbc-2aca-4054-9b21-b0bfe7436077\") " Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.292299 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df05fbc-2aca-4054-9b21-b0bfe7436077-config-data\") pod \"9df05fbc-2aca-4054-9b21-b0bfe7436077\" (UID: \"9df05fbc-2aca-4054-9b21-b0bfe7436077\") " Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.292339 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df05fbc-2aca-4054-9b21-b0bfe7436077-combined-ca-bundle\") pod \"9df05fbc-2aca-4054-9b21-b0bfe7436077\" (UID: \"9df05fbc-2aca-4054-9b21-b0bfe7436077\") " Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.299175 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9df05fbc-2aca-4054-9b21-b0bfe7436077-kube-api-access-9nbv9" (OuterVolumeSpecName: "kube-api-access-9nbv9") pod "9df05fbc-2aca-4054-9b21-b0bfe7436077" (UID: "9df05fbc-2aca-4054-9b21-b0bfe7436077"). InnerVolumeSpecName "kube-api-access-9nbv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.300016 4756 generic.go:334] "Generic (PLEG): container finished" podID="9df05fbc-2aca-4054-9b21-b0bfe7436077" containerID="0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece" exitCode=0 Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.300113 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9df05fbc-2aca-4054-9b21-b0bfe7436077","Type":"ContainerDied","Data":"0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece"} Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.300162 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.300177 4756 scope.go:117] "RemoveContainer" containerID="0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.300164 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9df05fbc-2aca-4054-9b21-b0bfe7436077","Type":"ContainerDied","Data":"761d3781e56c9c3d63e661d829a9c638f19123b7a94fa5ec4976188362e6dbb7"} Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.302895 4756 generic.go:334] "Generic (PLEG): container finished" podID="cb14feaf-f9db-4eee-aa10-a2da68219dfc" containerID="a5cfbab009eb2f3b7a8d269c82aafdc187e061cb0c0e608b421eca6640c3776a" exitCode=0 Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.302944 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb14feaf-f9db-4eee-aa10-a2da68219dfc","Type":"ContainerDied","Data":"a5cfbab009eb2f3b7a8d269c82aafdc187e061cb0c0e608b421eca6640c3776a"} Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.326663 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9df05fbc-2aca-4054-9b21-b0bfe7436077-config-data" (OuterVolumeSpecName: "config-data") pod "9df05fbc-2aca-4054-9b21-b0bfe7436077" (UID: "9df05fbc-2aca-4054-9b21-b0bfe7436077"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.333867 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9df05fbc-2aca-4054-9b21-b0bfe7436077-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9df05fbc-2aca-4054-9b21-b0bfe7436077" (UID: "9df05fbc-2aca-4054-9b21-b0bfe7436077"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.387457 4756 scope.go:117] "RemoveContainer" containerID="0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece" Mar 18 14:24:11 crc kubenswrapper[4756]: E0318 14:24:11.387887 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece\": container with ID starting with 0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece not found: ID does not exist" containerID="0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.387918 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece"} err="failed to get container status \"0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece\": rpc error: code = NotFound desc = could not find container \"0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece\": container with ID starting with 0326d133e225d46793dd699f0440173c8d107a279b03152820da2936d00c3ece not found: ID does not exist" Mar 18 14:24:11 crc 
kubenswrapper[4756]: I0318 14:24:11.394612 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df05fbc-2aca-4054-9b21-b0bfe7436077-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.394640 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df05fbc-2aca-4054-9b21-b0bfe7436077-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.394651 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nbv9\" (UniqueName: \"kubernetes.io/projected/9df05fbc-2aca-4054-9b21-b0bfe7436077-kube-api-access-9nbv9\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.394712 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.394750 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.571209 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.657635 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.670855 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.679999 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 14:24:11 crc kubenswrapper[4756]: E0318 14:24:11.680579 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb14feaf-f9db-4eee-aa10-a2da68219dfc" containerName="nova-api-api" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.680602 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb14feaf-f9db-4eee-aa10-a2da68219dfc" containerName="nova-api-api" Mar 18 14:24:11 crc kubenswrapper[4756]: E0318 14:24:11.680654 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb14feaf-f9db-4eee-aa10-a2da68219dfc" containerName="nova-api-log" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.680665 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb14feaf-f9db-4eee-aa10-a2da68219dfc" containerName="nova-api-log" Mar 18 14:24:11 crc kubenswrapper[4756]: E0318 14:24:11.680681 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df05fbc-2aca-4054-9b21-b0bfe7436077" containerName="nova-scheduler-scheduler" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.680689 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df05fbc-2aca-4054-9b21-b0bfe7436077" containerName="nova-scheduler-scheduler" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.680924 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb14feaf-f9db-4eee-aa10-a2da68219dfc" containerName="nova-api-api" Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.680959 4756 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="cb14feaf-f9db-4eee-aa10-a2da68219dfc" containerName="nova-api-log"
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.680986 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9df05fbc-2aca-4054-9b21-b0bfe7436077" containerName="nova-scheduler-scheduler"
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.682106 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.684369 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.691451 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.726376 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb14feaf-f9db-4eee-aa10-a2da68219dfc-config-data\") pod \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") "
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.726978 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb14feaf-f9db-4eee-aa10-a2da68219dfc-logs\") pod \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") "
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.727017 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb14feaf-f9db-4eee-aa10-a2da68219dfc-combined-ca-bundle\") pod \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") "
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.727067 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69rwd\" (UniqueName: \"kubernetes.io/projected/cb14feaf-f9db-4eee-aa10-a2da68219dfc-kube-api-access-69rwd\") pod \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\" (UID: \"cb14feaf-f9db-4eee-aa10-a2da68219dfc\") "
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.732037 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb14feaf-f9db-4eee-aa10-a2da68219dfc-logs" (OuterVolumeSpecName: "logs") pod "cb14feaf-f9db-4eee-aa10-a2da68219dfc" (UID: "cb14feaf-f9db-4eee-aa10-a2da68219dfc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.748279 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb14feaf-f9db-4eee-aa10-a2da68219dfc-kube-api-access-69rwd" (OuterVolumeSpecName: "kube-api-access-69rwd") pod "cb14feaf-f9db-4eee-aa10-a2da68219dfc" (UID: "cb14feaf-f9db-4eee-aa10-a2da68219dfc"). InnerVolumeSpecName "kube-api-access-69rwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.799914 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb14feaf-f9db-4eee-aa10-a2da68219dfc-config-data" (OuterVolumeSpecName: "config-data") pod "cb14feaf-f9db-4eee-aa10-a2da68219dfc" (UID: "cb14feaf-f9db-4eee-aa10-a2da68219dfc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.801157 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb14feaf-f9db-4eee-aa10-a2da68219dfc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb14feaf-f9db-4eee-aa10-a2da68219dfc" (UID: "cb14feaf-f9db-4eee-aa10-a2da68219dfc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.829264 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ffabff-92b2-4f54-b65e-eb3a0104f61e-config-data\") pod \"nova-scheduler-0\" (UID: \"09ffabff-92b2-4f54-b65e-eb3a0104f61e\") " pod="openstack/nova-scheduler-0"
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.829382 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ffabff-92b2-4f54-b65e-eb3a0104f61e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"09ffabff-92b2-4f54-b65e-eb3a0104f61e\") " pod="openstack/nova-scheduler-0"
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.829624 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwwqt\" (UniqueName: \"kubernetes.io/projected/09ffabff-92b2-4f54-b65e-eb3a0104f61e-kube-api-access-pwwqt\") pod \"nova-scheduler-0\" (UID: \"09ffabff-92b2-4f54-b65e-eb3a0104f61e\") " pod="openstack/nova-scheduler-0"
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.830045 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb14feaf-f9db-4eee-aa10-a2da68219dfc-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.830072 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb14feaf-f9db-4eee-aa10-a2da68219dfc-logs\") on node \"crc\" DevicePath \"\""
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.830087 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb14feaf-f9db-4eee-aa10-a2da68219dfc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 14:24:11 crc kubenswrapper[4756]: I0318 14:24:11.830103 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69rwd\" (UniqueName: \"kubernetes.io/projected/cb14feaf-f9db-4eee-aa10-a2da68219dfc-kube-api-access-69rwd\") on node \"crc\" DevicePath \"\""
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:11.931841 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ffabff-92b2-4f54-b65e-eb3a0104f61e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"09ffabff-92b2-4f54-b65e-eb3a0104f61e\") " pod="openstack/nova-scheduler-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:11.932548 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwwqt\" (UniqueName: \"kubernetes.io/projected/09ffabff-92b2-4f54-b65e-eb3a0104f61e-kube-api-access-pwwqt\") pod \"nova-scheduler-0\" (UID: \"09ffabff-92b2-4f54-b65e-eb3a0104f61e\") " pod="openstack/nova-scheduler-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:11.932639 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ffabff-92b2-4f54-b65e-eb3a0104f61e-config-data\") pod \"nova-scheduler-0\" (UID: \"09ffabff-92b2-4f54-b65e-eb3a0104f61e\") " pod="openstack/nova-scheduler-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:11.936077 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ffabff-92b2-4f54-b65e-eb3a0104f61e-config-data\") pod \"nova-scheduler-0\" (UID: \"09ffabff-92b2-4f54-b65e-eb3a0104f61e\") " pod="openstack/nova-scheduler-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:11.936884 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ffabff-92b2-4f54-b65e-eb3a0104f61e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"09ffabff-92b2-4f54-b65e-eb3a0104f61e\") " pod="openstack/nova-scheduler-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:11.949216 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwwqt\" (UniqueName: \"kubernetes.io/projected/09ffabff-92b2-4f54-b65e-eb3a0104f61e-kube-api-access-pwwqt\") pod \"nova-scheduler-0\" (UID: \"09ffabff-92b2-4f54-b65e-eb3a0104f61e\") " pod="openstack/nova-scheduler-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:11.996310 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.316100 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb14feaf-f9db-4eee-aa10-a2da68219dfc","Type":"ContainerDied","Data":"2fd656fa1b10ca64430c30aa0f24c15c9f7ce4dd91333a0a940d49fad84185df"}
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.316167 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.316179 4756 scope.go:117] "RemoveContainer" containerID="a5cfbab009eb2f3b7a8d269c82aafdc187e061cb0c0e608b421eca6640c3776a"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.372228 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.375714 4756 scope.go:117] "RemoveContainer" containerID="efe3ba966e239c2c2dcdaccd3a1599c0b46d5397106a4588fc828960a6aa24b0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.395288 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.412405 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.414705 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.417275 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.423497 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.545824 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gghzv\" (UniqueName: \"kubernetes.io/projected/9e6cd880-e90d-45aa-a0df-1062b926d00d-kube-api-access-gghzv\") pod \"nova-api-0\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " pod="openstack/nova-api-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.545925 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6cd880-e90d-45aa-a0df-1062b926d00d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " pod="openstack/nova-api-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.546027 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e6cd880-e90d-45aa-a0df-1062b926d00d-config-data\") pod \"nova-api-0\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " pod="openstack/nova-api-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.546072 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e6cd880-e90d-45aa-a0df-1062b926d00d-logs\") pod \"nova-api-0\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " pod="openstack/nova-api-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.647814 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gghzv\" (UniqueName: \"kubernetes.io/projected/9e6cd880-e90d-45aa-a0df-1062b926d00d-kube-api-access-gghzv\") pod \"nova-api-0\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " pod="openstack/nova-api-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.647883 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6cd880-e90d-45aa-a0df-1062b926d00d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " pod="openstack/nova-api-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.647944 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e6cd880-e90d-45aa-a0df-1062b926d00d-config-data\") pod \"nova-api-0\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " pod="openstack/nova-api-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.647995 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e6cd880-e90d-45aa-a0df-1062b926d00d-logs\") pod \"nova-api-0\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " pod="openstack/nova-api-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.650638 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e6cd880-e90d-45aa-a0df-1062b926d00d-logs\") pod \"nova-api-0\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " pod="openstack/nova-api-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.662289 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e6cd880-e90d-45aa-a0df-1062b926d00d-config-data\") pod \"nova-api-0\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " pod="openstack/nova-api-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.668920 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gghzv\" (UniqueName: \"kubernetes.io/projected/9e6cd880-e90d-45aa-a0df-1062b926d00d-kube-api-access-gghzv\") pod \"nova-api-0\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " pod="openstack/nova-api-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.675166 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6cd880-e90d-45aa-a0df-1062b926d00d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " pod="openstack/nova-api-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.731356 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 14:24:12 crc kubenswrapper[4756]: I0318 14:24:12.839001 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 14:24:13 crc kubenswrapper[4756]: I0318 14:24:13.334685 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9df05fbc-2aca-4054-9b21-b0bfe7436077" path="/var/lib/kubelet/pods/9df05fbc-2aca-4054-9b21-b0bfe7436077/volumes"
Mar 18 14:24:13 crc kubenswrapper[4756]: I0318 14:24:13.335603 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb14feaf-f9db-4eee-aa10-a2da68219dfc" path="/var/lib/kubelet/pods/cb14feaf-f9db-4eee-aa10-a2da68219dfc/volumes"
Mar 18 14:24:13 crc kubenswrapper[4756]: I0318 14:24:13.336384 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"09ffabff-92b2-4f54-b65e-eb3a0104f61e","Type":"ContainerStarted","Data":"9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1"}
Mar 18 14:24:13 crc kubenswrapper[4756]: I0318 14:24:13.336426 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"09ffabff-92b2-4f54-b65e-eb3a0104f61e","Type":"ContainerStarted","Data":"50fb464daf8c021758506350ee04dbb9b32bd016ab44359083d201bf1d2a9913"}
Mar 18 14:24:13 crc kubenswrapper[4756]: I0318 14:24:13.355943 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.355904577 podStartE2EDuration="2.355904577s" podCreationTimestamp="2026-03-18 14:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:24:13.347819848 +0000 UTC m=+1454.662237823" watchObservedRunningTime="2026-03-18 14:24:13.355904577 +0000 UTC m=+1454.670322582"
Mar 18 14:24:13 crc kubenswrapper[4756]: I0318 14:24:13.401109 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 14:24:13 crc kubenswrapper[4756]: I0318 14:24:13.568086 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 18 14:24:14 crc kubenswrapper[4756]: I0318 14:24:14.351282 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9e6cd880-e90d-45aa-a0df-1062b926d00d","Type":"ContainerStarted","Data":"625555a70f8ea9b9ae576b12ef7f09296f198b89b65c6e0fea52df5df4fd2517"}
Mar 18 14:24:14 crc kubenswrapper[4756]: I0318 14:24:14.351541 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9e6cd880-e90d-45aa-a0df-1062b926d00d","Type":"ContainerStarted","Data":"0fe85479583a1509af78f8d77bee8b1973bccad57fb0aae5c477a285b612f99f"}
Mar 18 14:24:14 crc kubenswrapper[4756]: I0318 14:24:14.351557 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9e6cd880-e90d-45aa-a0df-1062b926d00d","Type":"ContainerStarted","Data":"61d1e3031e03ea8d9ad004cd1e414c5b5ff2e26e47e425bd33501e072dca8b7f"}
Mar 18 14:24:14 crc kubenswrapper[4756]: I0318 14:24:14.370739 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.370717136 podStartE2EDuration="2.370717136s" podCreationTimestamp="2026-03-18 14:24:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:24:14.36789841 +0000 UTC m=+1455.682316385" watchObservedRunningTime="2026-03-18 14:24:14.370717136 +0000 UTC m=+1455.685135111"
Mar 18 14:24:16 crc kubenswrapper[4756]: I0318 14:24:16.651723 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 18 14:24:16 crc kubenswrapper[4756]: I0318 14:24:16.996741 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 18 14:24:18 crc kubenswrapper[4756]: I0318 14:24:18.003792 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 14:24:18 crc kubenswrapper[4756]: I0318 14:24:18.004304 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="67dc4771-f106-4f33-9f84-0d7251e4259d" containerName="kube-state-metrics" containerID="cri-o://5f570f3352803d95778137ed7834474549a34afc0b73569b324174ff4f99f7b5" gracePeriod=30
Mar 18 14:24:18 crc kubenswrapper[4756]: I0318 14:24:18.388142 4756 generic.go:334] "Generic (PLEG): container finished" podID="67dc4771-f106-4f33-9f84-0d7251e4259d" containerID="5f570f3352803d95778137ed7834474549a34afc0b73569b324174ff4f99f7b5" exitCode=2
Mar 18 14:24:18 crc kubenswrapper[4756]: I0318 14:24:18.388409 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"67dc4771-f106-4f33-9f84-0d7251e4259d","Type":"ContainerDied","Data":"5f570f3352803d95778137ed7834474549a34afc0b73569b324174ff4f99f7b5"}
Mar 18 14:24:18 crc kubenswrapper[4756]: I0318 14:24:18.859078 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 18 14:24:18 crc kubenswrapper[4756]: I0318 14:24:18.986300 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46x77\" (UniqueName: \"kubernetes.io/projected/67dc4771-f106-4f33-9f84-0d7251e4259d-kube-api-access-46x77\") pod \"67dc4771-f106-4f33-9f84-0d7251e4259d\" (UID: \"67dc4771-f106-4f33-9f84-0d7251e4259d\") "
Mar 18 14:24:18 crc kubenswrapper[4756]: I0318 14:24:18.999460 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67dc4771-f106-4f33-9f84-0d7251e4259d-kube-api-access-46x77" (OuterVolumeSpecName: "kube-api-access-46x77") pod "67dc4771-f106-4f33-9f84-0d7251e4259d" (UID: "67dc4771-f106-4f33-9f84-0d7251e4259d"). InnerVolumeSpecName "kube-api-access-46x77". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.108091 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46x77\" (UniqueName: \"kubernetes.io/projected/67dc4771-f106-4f33-9f84-0d7251e4259d-kube-api-access-46x77\") on node \"crc\" DevicePath \"\""
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.398717 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"67dc4771-f106-4f33-9f84-0d7251e4259d","Type":"ContainerDied","Data":"68c770138fc5e00ede674effb0b25a284c8f570dfecf7c4a1f6315c49e4168a9"}
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.398797 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.398991 4756 scope.go:117] "RemoveContainer" containerID="5f570f3352803d95778137ed7834474549a34afc0b73569b324174ff4f99f7b5"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.429694 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.438634 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.449144 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 14:24:19 crc kubenswrapper[4756]: E0318 14:24:19.449647 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67dc4771-f106-4f33-9f84-0d7251e4259d" containerName="kube-state-metrics"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.449663 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="67dc4771-f106-4f33-9f84-0d7251e4259d" containerName="kube-state-metrics"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.449829 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="67dc4771-f106-4f33-9f84-0d7251e4259d" containerName="kube-state-metrics"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.450625 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.453618 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.462108 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.463160 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.517514 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7707ba89-0a52-4e4e-bb4d-d381e3663d46-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7707ba89-0a52-4e4e-bb4d-d381e3663d46\") " pod="openstack/kube-state-metrics-0"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.517568 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7707ba89-0a52-4e4e-bb4d-d381e3663d46-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7707ba89-0a52-4e4e-bb4d-d381e3663d46\") " pod="openstack/kube-state-metrics-0"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.517789 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7707ba89-0a52-4e4e-bb4d-d381e3663d46-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7707ba89-0a52-4e4e-bb4d-d381e3663d46\") " pod="openstack/kube-state-metrics-0"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.517843 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtnr2\" (UniqueName: \"kubernetes.io/projected/7707ba89-0a52-4e4e-bb4d-d381e3663d46-kube-api-access-rtnr2\") pod \"kube-state-metrics-0\" (UID: \"7707ba89-0a52-4e4e-bb4d-d381e3663d46\") " pod="openstack/kube-state-metrics-0"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.620477 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7707ba89-0a52-4e4e-bb4d-d381e3663d46-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7707ba89-0a52-4e4e-bb4d-d381e3663d46\") " pod="openstack/kube-state-metrics-0"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.620541 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtnr2\" (UniqueName: \"kubernetes.io/projected/7707ba89-0a52-4e4e-bb4d-d381e3663d46-kube-api-access-rtnr2\") pod \"kube-state-metrics-0\" (UID: \"7707ba89-0a52-4e4e-bb4d-d381e3663d46\") " pod="openstack/kube-state-metrics-0"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.620738 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7707ba89-0a52-4e4e-bb4d-d381e3663d46-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7707ba89-0a52-4e4e-bb4d-d381e3663d46\") " pod="openstack/kube-state-metrics-0"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.620790 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7707ba89-0a52-4e4e-bb4d-d381e3663d46-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7707ba89-0a52-4e4e-bb4d-d381e3663d46\") " pod="openstack/kube-state-metrics-0"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.626078 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7707ba89-0a52-4e4e-bb4d-d381e3663d46-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7707ba89-0a52-4e4e-bb4d-d381e3663d46\") " pod="openstack/kube-state-metrics-0"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.627480 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7707ba89-0a52-4e4e-bb4d-d381e3663d46-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7707ba89-0a52-4e4e-bb4d-d381e3663d46\") " pod="openstack/kube-state-metrics-0"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.630678 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7707ba89-0a52-4e4e-bb4d-d381e3663d46-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7707ba89-0a52-4e4e-bb4d-d381e3663d46\") " pod="openstack/kube-state-metrics-0"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.637823 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtnr2\" (UniqueName: \"kubernetes.io/projected/7707ba89-0a52-4e4e-bb4d-d381e3663d46-kube-api-access-rtnr2\") pod \"kube-state-metrics-0\" (UID: \"7707ba89-0a52-4e4e-bb4d-d381e3663d46\") " pod="openstack/kube-state-metrics-0"
Mar 18 14:24:19 crc kubenswrapper[4756]: I0318 14:24:19.779681 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 18 14:24:20 crc kubenswrapper[4756]: I0318 14:24:20.245756 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 14:24:20 crc kubenswrapper[4756]: I0318 14:24:20.322846 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 14:24:20 crc kubenswrapper[4756]: I0318 14:24:20.323374 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="ceilometer-central-agent" containerID="cri-o://3bc13efaf539ce4c0c688e4a291261c3e82f4f0ba635d8228a20bf872c546483" gracePeriod=30
Mar 18 14:24:20 crc kubenswrapper[4756]: I0318 14:24:20.323395 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="proxy-httpd" containerID="cri-o://ce1864ce85c999a6a2848ddb0d2b55c29fd72f911b56e5e1717f60858e300f42" gracePeriod=30
Mar 18 14:24:20 crc kubenswrapper[4756]: I0318 14:24:20.323412 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="sg-core" containerID="cri-o://0b042a4e40d2d56323798913402c426b205d14c767658556b71b2514944bb5bf" gracePeriod=30
Mar 18 14:24:20 crc kubenswrapper[4756]: I0318 14:24:20.323503 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="ceilometer-notification-agent" containerID="cri-o://bcb153a107e360625e52455bf535271b34b8025a3f6bdfd4728ed1247eea38dd" gracePeriod=30
Mar 18 14:24:20 crc kubenswrapper[4756]: I0318 14:24:20.410735 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7707ba89-0a52-4e4e-bb4d-d381e3663d46","Type":"ContainerStarted","Data":"bd6a0b56211fbe1ca24d500460d110a807292870fb3c83f4d10e2e928f51f5f8"}
Mar 18 14:24:21 crc kubenswrapper[4756]: I0318 14:24:21.326929 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67dc4771-f106-4f33-9f84-0d7251e4259d" path="/var/lib/kubelet/pods/67dc4771-f106-4f33-9f84-0d7251e4259d/volumes"
Mar 18 14:24:21 crc kubenswrapper[4756]: I0318 14:24:21.423099 4756 generic.go:334] "Generic (PLEG): container finished" podID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerID="ce1864ce85c999a6a2848ddb0d2b55c29fd72f911b56e5e1717f60858e300f42" exitCode=0
Mar 18 14:24:21 crc kubenswrapper[4756]: I0318 14:24:21.423164 4756 generic.go:334] "Generic (PLEG): container finished" podID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerID="0b042a4e40d2d56323798913402c426b205d14c767658556b71b2514944bb5bf" exitCode=2
Mar 18 14:24:21 crc kubenswrapper[4756]: I0318 14:24:21.423173 4756 generic.go:334] "Generic (PLEG): container finished" podID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerID="3bc13efaf539ce4c0c688e4a291261c3e82f4f0ba635d8228a20bf872c546483" exitCode=0
Mar 18 14:24:21 crc kubenswrapper[4756]: I0318 14:24:21.423212 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2296fb-bd22-4c18-9ecb-7c6ad419b336","Type":"ContainerDied","Data":"ce1864ce85c999a6a2848ddb0d2b55c29fd72f911b56e5e1717f60858e300f42"}
Mar 18 14:24:21 crc kubenswrapper[4756]: I0318 14:24:21.423242 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2296fb-bd22-4c18-9ecb-7c6ad419b336","Type":"ContainerDied","Data":"0b042a4e40d2d56323798913402c426b205d14c767658556b71b2514944bb5bf"}
Mar 18 14:24:21 crc kubenswrapper[4756]: I0318 14:24:21.423256 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2296fb-bd22-4c18-9ecb-7c6ad419b336","Type":"ContainerDied","Data":"3bc13efaf539ce4c0c688e4a291261c3e82f4f0ba635d8228a20bf872c546483"}
Mar 18 14:24:21 crc kubenswrapper[4756]: I0318 14:24:21.424460 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7707ba89-0a52-4e4e-bb4d-d381e3663d46","Type":"ContainerStarted","Data":"72964bc74bc88976122865b87930a5692698732add40f0a7786c7acc84bfdda1"}
Mar 18 14:24:21 crc kubenswrapper[4756]: I0318 14:24:21.425739 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 18 14:24:21 crc kubenswrapper[4756]: I0318 14:24:21.449239 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.084620525 podStartE2EDuration="2.449220053s" podCreationTimestamp="2026-03-18 14:24:19 +0000 UTC" firstStartedPulling="2026-03-18 14:24:20.249429682 +0000 UTC m=+1461.563847657" lastFinishedPulling="2026-03-18 14:24:20.61402921 +0000 UTC m=+1461.928447185" observedRunningTime="2026-03-18 14:24:21.441209496 +0000 UTC m=+1462.755627481" watchObservedRunningTime="2026-03-18 14:24:21.449220053 +0000 UTC m=+1462.763638028"
Mar 18 14:24:21 crc kubenswrapper[4756]: I0318 14:24:21.997229 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 18 14:24:22 crc kubenswrapper[4756]: I0318 14:24:22.025975 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 18 14:24:22 crc kubenswrapper[4756]: I0318 14:24:22.436830 4756 generic.go:334] "Generic (PLEG): container finished" podID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerID="bcb153a107e360625e52455bf535271b34b8025a3f6bdfd4728ed1247eea38dd" exitCode=0
Mar 18 14:24:22 crc kubenswrapper[4756]: I0318 14:24:22.436913 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2296fb-bd22-4c18-9ecb-7c6ad419b336","Type":"ContainerDied","Data":"bcb153a107e360625e52455bf535271b34b8025a3f6bdfd4728ed1247eea38dd"}
Mar 18 14:24:22 crc kubenswrapper[4756]: I0318 14:24:22.473713 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 18 14:24:22 crc kubenswrapper[4756]: I0318 14:24:22.732694 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 14:24:22 crc kubenswrapper[4756]: I0318 14:24:22.733092 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.098706 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.203884 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-scripts\") pod \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") "
Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.204016 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-combined-ca-bundle\") pod \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") "
Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.204171 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-config-data\") pod \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") "
Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.204243 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-run-httpd\") pod \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") "
Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.204283 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-log-httpd\") pod \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") "
Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.204384 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vpnd\" (UniqueName: \"kubernetes.io/projected/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-kube-api-access-4vpnd\") pod \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") "
Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.204448 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-sg-core-conf-yaml\") pod \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\" (UID: \"ed2296fb-bd22-4c18-9ecb-7c6ad419b336\") "
Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.205642 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ed2296fb-bd22-4c18-9ecb-7c6ad419b336" (UID: "ed2296fb-bd22-4c18-9ecb-7c6ad419b336"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.205680 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ed2296fb-bd22-4c18-9ecb-7c6ad419b336" (UID: "ed2296fb-bd22-4c18-9ecb-7c6ad419b336"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.213155 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-scripts" (OuterVolumeSpecName: "scripts") pod "ed2296fb-bd22-4c18-9ecb-7c6ad419b336" (UID: "ed2296fb-bd22-4c18-9ecb-7c6ad419b336"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.213961 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-kube-api-access-4vpnd" (OuterVolumeSpecName: "kube-api-access-4vpnd") pod "ed2296fb-bd22-4c18-9ecb-7c6ad419b336" (UID: "ed2296fb-bd22-4c18-9ecb-7c6ad419b336"). InnerVolumeSpecName "kube-api-access-4vpnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.305335 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ed2296fb-bd22-4c18-9ecb-7c6ad419b336" (UID: "ed2296fb-bd22-4c18-9ecb-7c6ad419b336"). InnerVolumeSpecName "sg-core-conf-yaml".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.306898 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.306926 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.306943 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.306957 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vpnd\" (UniqueName: \"kubernetes.io/projected/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-kube-api-access-4vpnd\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.306974 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.347334 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed2296fb-bd22-4c18-9ecb-7c6ad419b336" (UID: "ed2296fb-bd22-4c18-9ecb-7c6ad419b336"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.363139 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-config-data" (OuterVolumeSpecName: "config-data") pod "ed2296fb-bd22-4c18-9ecb-7c6ad419b336" (UID: "ed2296fb-bd22-4c18-9ecb-7c6ad419b336"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.409687 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.409719 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed2296fb-bd22-4c18-9ecb-7c6ad419b336-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.448684 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2296fb-bd22-4c18-9ecb-7c6ad419b336","Type":"ContainerDied","Data":"9f339225384eb4e80c7aa8e8e45597ddc9748a273e29aab180b69de3775c6c5b"} Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.448762 4756 scope.go:117] "RemoveContainer" containerID="ce1864ce85c999a6a2848ddb0d2b55c29fd72f911b56e5e1717f60858e300f42" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.448783 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.482013 4756 scope.go:117] "RemoveContainer" containerID="0b042a4e40d2d56323798913402c426b205d14c767658556b71b2514944bb5bf" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.495156 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.511537 4756 scope.go:117] "RemoveContainer" containerID="bcb153a107e360625e52455bf535271b34b8025a3f6bdfd4728ed1247eea38dd" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.512975 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.526160 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:23 crc kubenswrapper[4756]: E0318 14:24:23.526641 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="ceilometer-central-agent" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.526654 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="ceilometer-central-agent" Mar 18 14:24:23 crc kubenswrapper[4756]: E0318 14:24:23.526666 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="sg-core" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.526672 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="sg-core" Mar 18 14:24:23 crc kubenswrapper[4756]: E0318 14:24:23.526685 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="proxy-httpd" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.526691 4756 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="proxy-httpd" Mar 18 14:24:23 crc kubenswrapper[4756]: E0318 14:24:23.526713 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="ceilometer-notification-agent" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.526720 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="ceilometer-notification-agent" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.526904 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="ceilometer-notification-agent" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.526921 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="ceilometer-central-agent" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.526931 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="proxy-httpd" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.526942 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" containerName="sg-core" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.529660 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.540883 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.541491 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.541759 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.543249 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.545560 4756 scope.go:117] "RemoveContainer" containerID="3bc13efaf539ce4c0c688e4a291261c3e82f4f0ba635d8228a20bf872c546483" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.614258 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-log-httpd\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.614307 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-config-data\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.614585 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 
18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.614785 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.614965 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-run-httpd\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.614999 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqkpr\" (UniqueName: \"kubernetes.io/projected/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-kube-api-access-qqkpr\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.615067 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-scripts\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.615146 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.716674 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.716765 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-run-httpd\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.716786 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqkpr\" (UniqueName: \"kubernetes.io/projected/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-kube-api-access-qqkpr\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.716812 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-scripts\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.716834 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.716883 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-log-httpd\") pod \"ceilometer-0\" (UID: 
\"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.716902 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-config-data\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.716958 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.717322 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-run-httpd\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.717629 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-log-httpd\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.722162 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.724042 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.725034 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.725083 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-scripts\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.725310 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-config-data\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.736056 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqkpr\" (UniqueName: \"kubernetes.io/projected/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-kube-api-access-qqkpr\") pod \"ceilometer-0\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " pod="openstack/ceilometer-0" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.815356 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9e6cd880-e90d-45aa-a0df-1062b926d00d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.228:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.815372 
4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9e6cd880-e90d-45aa-a0df-1062b926d00d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.228:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:24:23 crc kubenswrapper[4756]: I0318 14:24:23.864511 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:24:24 crc kubenswrapper[4756]: I0318 14:24:24.394415 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:24 crc kubenswrapper[4756]: I0318 14:24:24.460689 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9","Type":"ContainerStarted","Data":"af363481ec0f440deab45c1a680cf5054a3297d749c4ca8d24249a6d4fefc4af"} Mar 18 14:24:25 crc kubenswrapper[4756]: I0318 14:24:25.327384 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed2296fb-bd22-4c18-9ecb-7c6ad419b336" path="/var/lib/kubelet/pods/ed2296fb-bd22-4c18-9ecb-7c6ad419b336/volumes" Mar 18 14:24:25 crc kubenswrapper[4756]: I0318 14:24:25.470037 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9","Type":"ContainerStarted","Data":"1214d5a5db6b7d371dc8e91555eb113a55a2d2e965c4e6ca04189e6c795c613f"} Mar 18 14:24:26 crc kubenswrapper[4756]: I0318 14:24:26.482746 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9","Type":"ContainerStarted","Data":"629b771b7faec5b34a480c76366e6f35e6af6898a8a25fdd8633885cb0025e8b"} Mar 18 14:24:27 crc kubenswrapper[4756]: I0318 14:24:27.499533 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9","Type":"ContainerStarted","Data":"a1e796c0cf47bb2f0dbb4e82151e3a9a258275052ab13de075d0458cd87cd093"} Mar 18 14:24:28 crc kubenswrapper[4756]: I0318 14:24:28.095961 4756 scope.go:117] "RemoveContainer" containerID="46affe02979f6a10d58e923caec0ceede3e7aef28c406ac6c18fac7f02e67a78" Mar 18 14:24:29 crc kubenswrapper[4756]: I0318 14:24:29.539963 4756 generic.go:334] "Generic (PLEG): container finished" podID="b97ce503-7791-4a3c-b6b4-2dbfaccc94a1" containerID="920a6c13e95d8c98dca0033b3081f9edb413d45f4ac0725ae17af9b5665cff48" exitCode=137 Mar 18 14:24:29 crc kubenswrapper[4756]: I0318 14:24:29.540646 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1","Type":"ContainerDied","Data":"920a6c13e95d8c98dca0033b3081f9edb413d45f4ac0725ae17af9b5665cff48"} Mar 18 14:24:29 crc kubenswrapper[4756]: I0318 14:24:29.544322 4756 generic.go:334] "Generic (PLEG): container finished" podID="95f072df-7757-4f69-a289-b75f17b122ad" containerID="139dd49e881b7cb92a14d47495e0b4995a8a93c1b3d1d83d36bd8f33055088a2" exitCode=137 Mar 18 14:24:29 crc kubenswrapper[4756]: I0318 14:24:29.544394 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95f072df-7757-4f69-a289-b75f17b122ad","Type":"ContainerDied","Data":"139dd49e881b7cb92a14d47495e0b4995a8a93c1b3d1d83d36bd8f33055088a2"} Mar 18 14:24:29 crc kubenswrapper[4756]: I0318 14:24:29.546336 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9","Type":"ContainerStarted","Data":"93f9e394cdad1ab6bef8c1c9549801d13f8d0122f7cdf372e0534fc7abc728ca"} Mar 18 14:24:29 crc kubenswrapper[4756]: I0318 14:24:29.547644 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 14:24:29 crc kubenswrapper[4756]: I0318 14:24:29.587677 4756 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.325585587 podStartE2EDuration="6.587639468s" podCreationTimestamp="2026-03-18 14:24:23 +0000 UTC" firstStartedPulling="2026-03-18 14:24:24.39300505 +0000 UTC m=+1465.707423015" lastFinishedPulling="2026-03-18 14:24:28.655058921 +0000 UTC m=+1469.969476896" observedRunningTime="2026-03-18 14:24:29.573528086 +0000 UTC m=+1470.887946061" watchObservedRunningTime="2026-03-18 14:24:29.587639468 +0000 UTC m=+1470.902057443" Mar 18 14:24:29 crc kubenswrapper[4756]: E0318 14:24:29.623511 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb97ce503_7791_4a3c_b6b4_2dbfaccc94a1.slice/crio-920a6c13e95d8c98dca0033b3081f9edb413d45f4ac0725ae17af9b5665cff48.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb97ce503_7791_4a3c_b6b4_2dbfaccc94a1.slice/crio-conmon-920a6c13e95d8c98dca0033b3081f9edb413d45f4ac0725ae17af9b5665cff48.scope\": RecentStats: unable to find data in memory cache]" Mar 18 14:24:29 crc kubenswrapper[4756]: I0318 14:24:29.838469 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 14:24:29 crc kubenswrapper[4756]: I0318 14:24:29.891356 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 14:24:29 crc kubenswrapper[4756]: I0318 14:24:29.956267 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f072df-7757-4f69-a289-b75f17b122ad-combined-ca-bundle\") pod \"95f072df-7757-4f69-a289-b75f17b122ad\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " Mar 18 14:24:29 crc kubenswrapper[4756]: I0318 14:24:29.956546 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95f072df-7757-4f69-a289-b75f17b122ad-config-data\") pod \"95f072df-7757-4f69-a289-b75f17b122ad\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " Mar 18 14:24:29 crc kubenswrapper[4756]: I0318 14:24:29.956575 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95f072df-7757-4f69-a289-b75f17b122ad-logs\") pod \"95f072df-7757-4f69-a289-b75f17b122ad\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " Mar 18 14:24:29 crc kubenswrapper[4756]: I0318 14:24:29.956597 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klzp5\" (UniqueName: \"kubernetes.io/projected/95f072df-7757-4f69-a289-b75f17b122ad-kube-api-access-klzp5\") pod \"95f072df-7757-4f69-a289-b75f17b122ad\" (UID: \"95f072df-7757-4f69-a289-b75f17b122ad\") " Mar 18 14:24:29 crc kubenswrapper[4756]: I0318 14:24:29.970660 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95f072df-7757-4f69-a289-b75f17b122ad-logs" (OuterVolumeSpecName: "logs") pod "95f072df-7757-4f69-a289-b75f17b122ad" (UID: "95f072df-7757-4f69-a289-b75f17b122ad"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:24:29 crc kubenswrapper[4756]: I0318 14:24:29.986250 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95f072df-7757-4f69-a289-b75f17b122ad-kube-api-access-klzp5" (OuterVolumeSpecName: "kube-api-access-klzp5") pod "95f072df-7757-4f69-a289-b75f17b122ad" (UID: "95f072df-7757-4f69-a289-b75f17b122ad"). InnerVolumeSpecName "kube-api-access-klzp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.011827 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f072df-7757-4f69-a289-b75f17b122ad-config-data" (OuterVolumeSpecName: "config-data") pod "95f072df-7757-4f69-a289-b75f17b122ad" (UID: "95f072df-7757-4f69-a289-b75f17b122ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.033858 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f072df-7757-4f69-a289-b75f17b122ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95f072df-7757-4f69-a289-b75f17b122ad" (UID: "95f072df-7757-4f69-a289-b75f17b122ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.058621 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f072df-7757-4f69-a289-b75f17b122ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.058651 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95f072df-7757-4f69-a289-b75f17b122ad-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.058662 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95f072df-7757-4f69-a289-b75f17b122ad-logs\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.058670 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klzp5\" (UniqueName: \"kubernetes.io/projected/95f072df-7757-4f69-a289-b75f17b122ad-kube-api-access-klzp5\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.301367 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.468020 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-config-data\") pod \"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1\" (UID: \"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1\") " Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.468173 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-combined-ca-bundle\") pod \"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1\" (UID: \"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1\") " Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.468244 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq9tr\" (UniqueName: \"kubernetes.io/projected/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-kube-api-access-zq9tr\") pod \"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1\" (UID: \"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1\") " Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.472534 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-kube-api-access-zq9tr" (OuterVolumeSpecName: "kube-api-access-zq9tr") pod "b97ce503-7791-4a3c-b6b4-2dbfaccc94a1" (UID: "b97ce503-7791-4a3c-b6b4-2dbfaccc94a1"). InnerVolumeSpecName "kube-api-access-zq9tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.497976 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-config-data" (OuterVolumeSpecName: "config-data") pod "b97ce503-7791-4a3c-b6b4-2dbfaccc94a1" (UID: "b97ce503-7791-4a3c-b6b4-2dbfaccc94a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.509589 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b97ce503-7791-4a3c-b6b4-2dbfaccc94a1" (UID: "b97ce503-7791-4a3c-b6b4-2dbfaccc94a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.557004 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b97ce503-7791-4a3c-b6b4-2dbfaccc94a1","Type":"ContainerDied","Data":"7fb95c7c1905a3539ce190068f7cd75f8fb7120054d6c0e02e6eab295464012c"} Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.557051 4756 scope.go:117] "RemoveContainer" containerID="920a6c13e95d8c98dca0033b3081f9edb413d45f4ac0725ae17af9b5665cff48" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.557187 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.561940 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.561993 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95f072df-7757-4f69-a289-b75f17b122ad","Type":"ContainerDied","Data":"dc3b1bc5ac592590dac1e325c3982d3831d5874e39861b94f0c1918f7636a309"} Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.571271 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.571311 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.571325 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq9tr\" (UniqueName: \"kubernetes.io/projected/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1-kube-api-access-zq9tr\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.584975 4756 scope.go:117] "RemoveContainer" containerID="139dd49e881b7cb92a14d47495e0b4995a8a93c1b3d1d83d36bd8f33055088a2" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.601222 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.610451 4756 scope.go:117] "RemoveContainer" containerID="5cc3858f0e33434acfab966cdbe709bf219600e383f27ba0bccf560cc6cc9592" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.619637 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.632230 4756 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 14:24:30 crc kubenswrapper[4756]: E0318 14:24:30.632692 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f072df-7757-4f69-a289-b75f17b122ad" containerName="nova-metadata-metadata" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.632705 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f072df-7757-4f69-a289-b75f17b122ad" containerName="nova-metadata-metadata" Mar 18 14:24:30 crc kubenswrapper[4756]: E0318 14:24:30.632727 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f072df-7757-4f69-a289-b75f17b122ad" containerName="nova-metadata-log" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.632733 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f072df-7757-4f69-a289-b75f17b122ad" containerName="nova-metadata-log" Mar 18 14:24:30 crc kubenswrapper[4756]: E0318 14:24:30.632758 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97ce503-7791-4a3c-b6b4-2dbfaccc94a1" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.632765 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97ce503-7791-4a3c-b6b4-2dbfaccc94a1" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.632975 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="95f072df-7757-4f69-a289-b75f17b122ad" containerName="nova-metadata-log" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.632992 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b97ce503-7791-4a3c-b6b4-2dbfaccc94a1" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.633016 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="95f072df-7757-4f69-a289-b75f17b122ad" containerName="nova-metadata-metadata" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.633792 4756 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.640708 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.640880 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.640980 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.648268 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.661055 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.683358 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.695173 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.696926 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.700612 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.701297 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.718673 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.732909 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.733314 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.775264 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g58qh\" (UniqueName: \"kubernetes.io/projected/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-kube-api-access-g58qh\") pod \"nova-metadata-0\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") " pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.775332 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") " pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.775367 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-logs\") pod \"nova-metadata-0\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") " 
pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.775411 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08e6109-a0d3-4f6a-a790-c2e1725f635c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e08e6109-a0d3-4f6a-a790-c2e1725f635c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.775477 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") " pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.775519 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-config-data\") pod \"nova-metadata-0\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") " pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.775539 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4nmj\" (UniqueName: \"kubernetes.io/projected/e08e6109-a0d3-4f6a-a790-c2e1725f635c-kube-api-access-v4nmj\") pod \"nova-cell1-novncproxy-0\" (UID: \"e08e6109-a0d3-4f6a-a790-c2e1725f635c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.775565 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08e6109-a0d3-4f6a-a790-c2e1725f635c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e08e6109-a0d3-4f6a-a790-c2e1725f635c\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.775590 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08e6109-a0d3-4f6a-a790-c2e1725f635c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e08e6109-a0d3-4f6a-a790-c2e1725f635c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.775633 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08e6109-a0d3-4f6a-a790-c2e1725f635c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e08e6109-a0d3-4f6a-a790-c2e1725f635c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.877527 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") " pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.877616 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-config-data\") pod \"nova-metadata-0\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") " pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.877653 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4nmj\" (UniqueName: \"kubernetes.io/projected/e08e6109-a0d3-4f6a-a790-c2e1725f635c-kube-api-access-v4nmj\") pod \"nova-cell1-novncproxy-0\" (UID: \"e08e6109-a0d3-4f6a-a790-c2e1725f635c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc 
kubenswrapper[4756]: I0318 14:24:30.877686 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08e6109-a0d3-4f6a-a790-c2e1725f635c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e08e6109-a0d3-4f6a-a790-c2e1725f635c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.877721 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08e6109-a0d3-4f6a-a790-c2e1725f635c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e08e6109-a0d3-4f6a-a790-c2e1725f635c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.877784 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08e6109-a0d3-4f6a-a790-c2e1725f635c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e08e6109-a0d3-4f6a-a790-c2e1725f635c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.877850 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g58qh\" (UniqueName: \"kubernetes.io/projected/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-kube-api-access-g58qh\") pod \"nova-metadata-0\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") " pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.877879 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") " pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.877899 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-logs\") pod \"nova-metadata-0\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") " pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.877934 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08e6109-a0d3-4f6a-a790-c2e1725f635c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e08e6109-a0d3-4f6a-a790-c2e1725f635c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.878831 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-logs\") pod \"nova-metadata-0\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") " pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.882976 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08e6109-a0d3-4f6a-a790-c2e1725f635c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e08e6109-a0d3-4f6a-a790-c2e1725f635c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.883570 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08e6109-a0d3-4f6a-a790-c2e1725f635c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e08e6109-a0d3-4f6a-a790-c2e1725f635c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.884078 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08e6109-a0d3-4f6a-a790-c2e1725f635c-vencrypt-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"e08e6109-a0d3-4f6a-a790-c2e1725f635c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.884382 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") " pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.884535 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-config-data\") pod \"nova-metadata-0\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") " pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.885791 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08e6109-a0d3-4f6a-a790-c2e1725f635c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e08e6109-a0d3-4f6a-a790-c2e1725f635c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.888534 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") " pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.903519 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g58qh\" (UniqueName: \"kubernetes.io/projected/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-kube-api-access-g58qh\") pod \"nova-metadata-0\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") " pod="openstack/nova-metadata-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 
14:24:30.909410 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4nmj\" (UniqueName: \"kubernetes.io/projected/e08e6109-a0d3-4f6a-a790-c2e1725f635c-kube-api-access-v4nmj\") pod \"nova-cell1-novncproxy-0\" (UID: \"e08e6109-a0d3-4f6a-a790-c2e1725f635c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:30 crc kubenswrapper[4756]: I0318 14:24:30.954960 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:31 crc kubenswrapper[4756]: I0318 14:24:31.017835 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 14:24:31 crc kubenswrapper[4756]: I0318 14:24:31.326498 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95f072df-7757-4f69-a289-b75f17b122ad" path="/var/lib/kubelet/pods/95f072df-7757-4f69-a289-b75f17b122ad/volumes" Mar 18 14:24:31 crc kubenswrapper[4756]: I0318 14:24:31.327332 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b97ce503-7791-4a3c-b6b4-2dbfaccc94a1" path="/var/lib/kubelet/pods/b97ce503-7791-4a3c-b6b4-2dbfaccc94a1/volumes" Mar 18 14:24:31 crc kubenswrapper[4756]: I0318 14:24:31.482212 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 14:24:31 crc kubenswrapper[4756]: I0318 14:24:31.614723 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e08e6109-a0d3-4f6a-a790-c2e1725f635c","Type":"ContainerStarted","Data":"1a0157964605dabeb8bfc6f6381e5ea98096e3d8241fc5f7d1e1f1552eaa4d06"} Mar 18 14:24:31 crc kubenswrapper[4756]: I0318 14:24:31.773014 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 14:24:32 crc kubenswrapper[4756]: I0318 14:24:32.623238 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"e08e6109-a0d3-4f6a-a790-c2e1725f635c","Type":"ContainerStarted","Data":"8b43116f2f0127b7df4d8586d661f8f469e76178858af4b8d68617d6c1bfded1"} Mar 18 14:24:32 crc kubenswrapper[4756]: I0318 14:24:32.627933 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3","Type":"ContainerStarted","Data":"1cd38dc7d27a2683d688d561cae96bd617d30db38ab59c51082dfbe8585f365d"} Mar 18 14:24:32 crc kubenswrapper[4756]: I0318 14:24:32.627980 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3","Type":"ContainerStarted","Data":"9ea28543579c8a4a82ebf6cc16c9eaec55097eea58578b417baf645ca889972a"} Mar 18 14:24:32 crc kubenswrapper[4756]: I0318 14:24:32.627994 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3","Type":"ContainerStarted","Data":"d271eccbda9dbae407d585e2a4b88c1fdb9b02fc598e2ac27a80e23440213ca5"} Mar 18 14:24:32 crc kubenswrapper[4756]: I0318 14:24:32.647061 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.6470454610000003 podStartE2EDuration="2.647045461s" podCreationTimestamp="2026-03-18 14:24:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:24:32.643200667 +0000 UTC m=+1473.957618642" watchObservedRunningTime="2026-03-18 14:24:32.647045461 +0000 UTC m=+1473.961463436" Mar 18 14:24:32 crc kubenswrapper[4756]: I0318 14:24:32.670108 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.670088505 podStartE2EDuration="2.670088505s" podCreationTimestamp="2026-03-18 14:24:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:24:32.661851882 +0000 UTC m=+1473.976269857" watchObservedRunningTime="2026-03-18 14:24:32.670088505 +0000 UTC m=+1473.984506480" Mar 18 14:24:32 crc kubenswrapper[4756]: I0318 14:24:32.737303 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 14:24:32 crc kubenswrapper[4756]: I0318 14:24:32.737667 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 14:24:32 crc kubenswrapper[4756]: I0318 14:24:32.741478 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 14:24:33 crc kubenswrapper[4756]: I0318 14:24:33.640402 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 14:24:33 crc kubenswrapper[4756]: I0318 14:24:33.815825 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-t92ls"] Mar 18 14:24:33 crc kubenswrapper[4756]: I0318 14:24:33.817943 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:33 crc kubenswrapper[4756]: I0318 14:24:33.841437 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-t92ls"] Mar 18 14:24:33 crc kubenswrapper[4756]: I0318 14:24:33.952475 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:33 crc kubenswrapper[4756]: I0318 14:24:33.952542 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnxgx\" (UniqueName: \"kubernetes.io/projected/a6501734-2465-4925-8b50-4b7762bb9c4e-kube-api-access-gnxgx\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:33 crc kubenswrapper[4756]: I0318 14:24:33.952856 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:33 crc kubenswrapper[4756]: I0318 14:24:33.952976 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:33 crc kubenswrapper[4756]: I0318 14:24:33.953030 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-config\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:33 crc kubenswrapper[4756]: I0318 14:24:33.953331 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:34 crc kubenswrapper[4756]: I0318 14:24:34.055365 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:34 crc kubenswrapper[4756]: I0318 14:24:34.055422 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnxgx\" (UniqueName: \"kubernetes.io/projected/a6501734-2465-4925-8b50-4b7762bb9c4e-kube-api-access-gnxgx\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:34 crc kubenswrapper[4756]: I0318 14:24:34.055481 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:34 crc kubenswrapper[4756]: I0318 14:24:34.055519 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:34 crc kubenswrapper[4756]: I0318 14:24:34.055544 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-config\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:34 crc kubenswrapper[4756]: I0318 14:24:34.055612 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:34 crc kubenswrapper[4756]: I0318 14:24:34.056337 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:34 crc kubenswrapper[4756]: I0318 14:24:34.056409 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-config\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:34 crc kubenswrapper[4756]: I0318 14:24:34.058874 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:34 crc kubenswrapper[4756]: I0318 14:24:34.061634 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:34 crc kubenswrapper[4756]: I0318 14:24:34.061755 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:34 crc kubenswrapper[4756]: I0318 14:24:34.093060 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnxgx\" (UniqueName: \"kubernetes.io/projected/a6501734-2465-4925-8b50-4b7762bb9c4e-kube-api-access-gnxgx\") pod \"dnsmasq-dns-5fd9b586ff-t92ls\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:34 crc kubenswrapper[4756]: I0318 14:24:34.138000 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:34 crc kubenswrapper[4756]: I0318 14:24:34.708497 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-t92ls"] Mar 18 14:24:35 crc kubenswrapper[4756]: I0318 14:24:35.654528 4756 generic.go:334] "Generic (PLEG): container finished" podID="a6501734-2465-4925-8b50-4b7762bb9c4e" containerID="5798c74bf8a6bab35023505488160932c01cf866f900bc391dd7402af5a8b2e8" exitCode=0 Mar 18 14:24:35 crc kubenswrapper[4756]: I0318 14:24:35.654738 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" event={"ID":"a6501734-2465-4925-8b50-4b7762bb9c4e","Type":"ContainerDied","Data":"5798c74bf8a6bab35023505488160932c01cf866f900bc391dd7402af5a8b2e8"} Mar 18 14:24:35 crc kubenswrapper[4756]: I0318 14:24:35.654886 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" event={"ID":"a6501734-2465-4925-8b50-4b7762bb9c4e","Type":"ContainerStarted","Data":"8c858785a0a1cbe3e47863b0c4d02f4ed597bac0981c1ba7154779d1eee3268d"} Mar 18 14:24:35 crc kubenswrapper[4756]: I0318 14:24:35.955312 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:36 crc kubenswrapper[4756]: I0318 14:24:36.334701 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 14:24:36 crc kubenswrapper[4756]: I0318 14:24:36.664975 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" event={"ID":"a6501734-2465-4925-8b50-4b7762bb9c4e","Type":"ContainerStarted","Data":"f6d1c75a36cfe5377d574a829181cc49563d936b9123fdda423c340b1fc8f637"} Mar 18 14:24:36 crc kubenswrapper[4756]: I0318 14:24:36.665073 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9e6cd880-e90d-45aa-a0df-1062b926d00d" containerName="nova-api-log" 
containerID="cri-o://0fe85479583a1509af78f8d77bee8b1973bccad57fb0aae5c477a285b612f99f" gracePeriod=30 Mar 18 14:24:36 crc kubenswrapper[4756]: I0318 14:24:36.665204 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9e6cd880-e90d-45aa-a0df-1062b926d00d" containerName="nova-api-api" containerID="cri-o://625555a70f8ea9b9ae576b12ef7f09296f198b89b65c6e0fea52df5df4fd2517" gracePeriod=30 Mar 18 14:24:36 crc kubenswrapper[4756]: I0318 14:24:36.689180 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" podStartSLOduration=3.689165784 podStartE2EDuration="3.689165784s" podCreationTimestamp="2026-03-18 14:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:24:36.68602549 +0000 UTC m=+1478.000443455" watchObservedRunningTime="2026-03-18 14:24:36.689165784 +0000 UTC m=+1478.003583759" Mar 18 14:24:37 crc kubenswrapper[4756]: I0318 14:24:37.676679 4756 generic.go:334] "Generic (PLEG): container finished" podID="9e6cd880-e90d-45aa-a0df-1062b926d00d" containerID="0fe85479583a1509af78f8d77bee8b1973bccad57fb0aae5c477a285b612f99f" exitCode=143 Mar 18 14:24:37 crc kubenswrapper[4756]: I0318 14:24:37.676773 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9e6cd880-e90d-45aa-a0df-1062b926d00d","Type":"ContainerDied","Data":"0fe85479583a1509af78f8d77bee8b1973bccad57fb0aae5c477a285b612f99f"} Mar 18 14:24:37 crc kubenswrapper[4756]: I0318 14:24:37.677319 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:37 crc kubenswrapper[4756]: I0318 14:24:37.722563 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:37 crc kubenswrapper[4756]: I0318 14:24:37.722825 4756 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="ceilometer-central-agent" containerID="cri-o://1214d5a5db6b7d371dc8e91555eb113a55a2d2e965c4e6ca04189e6c795c613f" gracePeriod=30 Mar 18 14:24:37 crc kubenswrapper[4756]: I0318 14:24:37.722903 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="proxy-httpd" containerID="cri-o://93f9e394cdad1ab6bef8c1c9549801d13f8d0122f7cdf372e0534fc7abc728ca" gracePeriod=30 Mar 18 14:24:37 crc kubenswrapper[4756]: I0318 14:24:37.722903 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="ceilometer-notification-agent" containerID="cri-o://629b771b7faec5b34a480c76366e6f35e6af6898a8a25fdd8633885cb0025e8b" gracePeriod=30 Mar 18 14:24:37 crc kubenswrapper[4756]: I0318 14:24:37.723208 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="sg-core" containerID="cri-o://a1e796c0cf47bb2f0dbb4e82151e3a9a258275052ab13de075d0458cd87cd093" gracePeriod=30 Mar 18 14:24:38 crc kubenswrapper[4756]: I0318 14:24:38.690908 4756 generic.go:334] "Generic (PLEG): container finished" podID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerID="93f9e394cdad1ab6bef8c1c9549801d13f8d0122f7cdf372e0534fc7abc728ca" exitCode=0 Mar 18 14:24:38 crc kubenswrapper[4756]: I0318 14:24:38.691263 4756 generic.go:334] "Generic (PLEG): container finished" podID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerID="a1e796c0cf47bb2f0dbb4e82151e3a9a258275052ab13de075d0458cd87cd093" exitCode=2 Mar 18 14:24:38 crc kubenswrapper[4756]: I0318 14:24:38.691271 4756 generic.go:334] "Generic (PLEG): container finished" podID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" 
containerID="629b771b7faec5b34a480c76366e6f35e6af6898a8a25fdd8633885cb0025e8b" exitCode=0 Mar 18 14:24:38 crc kubenswrapper[4756]: I0318 14:24:38.691278 4756 generic.go:334] "Generic (PLEG): container finished" podID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerID="1214d5a5db6b7d371dc8e91555eb113a55a2d2e965c4e6ca04189e6c795c613f" exitCode=0 Mar 18 14:24:38 crc kubenswrapper[4756]: I0318 14:24:38.691234 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9","Type":"ContainerDied","Data":"93f9e394cdad1ab6bef8c1c9549801d13f8d0122f7cdf372e0534fc7abc728ca"} Mar 18 14:24:38 crc kubenswrapper[4756]: I0318 14:24:38.691356 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9","Type":"ContainerDied","Data":"a1e796c0cf47bb2f0dbb4e82151e3a9a258275052ab13de075d0458cd87cd093"} Mar 18 14:24:38 crc kubenswrapper[4756]: I0318 14:24:38.691392 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9","Type":"ContainerDied","Data":"629b771b7faec5b34a480c76366e6f35e6af6898a8a25fdd8633885cb0025e8b"} Mar 18 14:24:38 crc kubenswrapper[4756]: I0318 14:24:38.691402 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9","Type":"ContainerDied","Data":"1214d5a5db6b7d371dc8e91555eb113a55a2d2e965c4e6ca04189e6c795c613f"} Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.015380 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.165709 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-sg-core-conf-yaml\") pod \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.166081 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-log-httpd\") pod \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.166238 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-ceilometer-tls-certs\") pod \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.166287 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-config-data\") pod \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.166343 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-run-httpd\") pod \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.166379 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-combined-ca-bundle\") pod \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.166446 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-scripts\") pod \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.166488 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqkpr\" (UniqueName: \"kubernetes.io/projected/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-kube-api-access-qqkpr\") pod \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\" (UID: \"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9\") " Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.166490 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" (UID: "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.166674 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" (UID: "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.166963 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.166981 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.173897 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-kube-api-access-qqkpr" (OuterVolumeSpecName: "kube-api-access-qqkpr") pod "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" (UID: "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9"). InnerVolumeSpecName "kube-api-access-qqkpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.177218 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-scripts" (OuterVolumeSpecName: "scripts") pod "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" (UID: "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.224006 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" (UID: "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.228895 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" (UID: "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.269698 4756 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.269729 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.269738 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqkpr\" (UniqueName: \"kubernetes.io/projected/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-kube-api-access-qqkpr\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.269748 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.319268 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-config-data" (OuterVolumeSpecName: "config-data") pod "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" (UID: "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.323402 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" (UID: "b83c6dbf-6207-47ad-bc91-2fbf6051cbe9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.371723 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.371765 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.702395 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b83c6dbf-6207-47ad-bc91-2fbf6051cbe9","Type":"ContainerDied","Data":"af363481ec0f440deab45c1a680cf5054a3297d749c4ca8d24249a6d4fefc4af"} Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.702705 4756 scope.go:117] "RemoveContainer" containerID="93f9e394cdad1ab6bef8c1c9549801d13f8d0122f7cdf372e0534fc7abc728ca" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.702444 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.732194 4756 scope.go:117] "RemoveContainer" containerID="a1e796c0cf47bb2f0dbb4e82151e3a9a258275052ab13de075d0458cd87cd093" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.758888 4756 scope.go:117] "RemoveContainer" containerID="629b771b7faec5b34a480c76366e6f35e6af6898a8a25fdd8633885cb0025e8b" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.759580 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.785362 4756 scope.go:117] "RemoveContainer" containerID="1214d5a5db6b7d371dc8e91555eb113a55a2d2e965c4e6ca04189e6c795c613f" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.805753 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.824555 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:39 crc kubenswrapper[4756]: E0318 14:24:39.825033 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="ceilometer-central-agent" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.825054 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="ceilometer-central-agent" Mar 18 14:24:39 crc kubenswrapper[4756]: E0318 14:24:39.825070 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="sg-core" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.825077 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="sg-core" Mar 18 14:24:39 crc kubenswrapper[4756]: E0318 14:24:39.825097 4756 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="ceilometer-notification-agent" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.825102 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="ceilometer-notification-agent" Mar 18 14:24:39 crc kubenswrapper[4756]: E0318 14:24:39.825138 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="proxy-httpd" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.825146 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="proxy-httpd" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.825336 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="ceilometer-notification-agent" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.825353 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="proxy-httpd" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.825363 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="sg-core" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.825382 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" containerName="ceilometer-central-agent" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.827200 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.830587 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.831476 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.831685 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.839927 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.985271 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pc84\" (UniqueName: \"kubernetes.io/projected/9ae77056-644f-47f8-8c84-0dfa7dc9253c-kube-api-access-2pc84\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.985343 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.985411 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ae77056-644f-47f8-8c84-0dfa7dc9253c-run-httpd\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.985427 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-scripts\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.985457 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-config-data\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.985495 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.985521 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ae77056-644f-47f8-8c84-0dfa7dc9253c-log-httpd\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:39 crc kubenswrapper[4756]: I0318 14:24:39.985535 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.087624 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ae77056-644f-47f8-8c84-0dfa7dc9253c-run-httpd\") pod \"ceilometer-0\" (UID: 
\"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.088111 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-scripts\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.088267 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-config-data\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.088981 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.089354 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ae77056-644f-47f8-8c84-0dfa7dc9253c-log-httpd\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.089447 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.089604 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pc84\" 
(UniqueName: \"kubernetes.io/projected/9ae77056-644f-47f8-8c84-0dfa7dc9253c-kube-api-access-2pc84\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.089758 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.089665 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ae77056-644f-47f8-8c84-0dfa7dc9253c-log-httpd\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.088165 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ae77056-644f-47f8-8c84-0dfa7dc9253c-run-httpd\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.111744 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.112495 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-config-data\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.113610 
4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.114180 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-scripts\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.117793 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.130878 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pc84\" (UniqueName: \"kubernetes.io/projected/9ae77056-644f-47f8-8c84-0dfa7dc9253c-kube-api-access-2pc84\") pod \"ceilometer-0\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.149599 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.371951 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.718869 4756 generic.go:334] "Generic (PLEG): container finished" podID="9e6cd880-e90d-45aa-a0df-1062b926d00d" containerID="625555a70f8ea9b9ae576b12ef7f09296f198b89b65c6e0fea52df5df4fd2517" exitCode=0 Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.718915 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9e6cd880-e90d-45aa-a0df-1062b926d00d","Type":"ContainerDied","Data":"625555a70f8ea9b9ae576b12ef7f09296f198b89b65c6e0fea52df5df4fd2517"} Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.872009 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.955788 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:40 crc kubenswrapper[4756]: I0318 14:24:40.979308 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.018814 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.018856 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.234634 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.331022 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gghzv\" (UniqueName: \"kubernetes.io/projected/9e6cd880-e90d-45aa-a0df-1062b926d00d-kube-api-access-gghzv\") pod \"9e6cd880-e90d-45aa-a0df-1062b926d00d\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.331260 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e6cd880-e90d-45aa-a0df-1062b926d00d-logs\") pod \"9e6cd880-e90d-45aa-a0df-1062b926d00d\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.331371 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6cd880-e90d-45aa-a0df-1062b926d00d-combined-ca-bundle\") pod \"9e6cd880-e90d-45aa-a0df-1062b926d00d\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.331546 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e6cd880-e90d-45aa-a0df-1062b926d00d-config-data\") pod \"9e6cd880-e90d-45aa-a0df-1062b926d00d\" (UID: \"9e6cd880-e90d-45aa-a0df-1062b926d00d\") " Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.331875 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e6cd880-e90d-45aa-a0df-1062b926d00d-logs" (OuterVolumeSpecName: "logs") pod "9e6cd880-e90d-45aa-a0df-1062b926d00d" (UID: "9e6cd880-e90d-45aa-a0df-1062b926d00d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.332245 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e6cd880-e90d-45aa-a0df-1062b926d00d-logs\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.334019 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b83c6dbf-6207-47ad-bc91-2fbf6051cbe9" path="/var/lib/kubelet/pods/b83c6dbf-6207-47ad-bc91-2fbf6051cbe9/volumes" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.336802 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e6cd880-e90d-45aa-a0df-1062b926d00d-kube-api-access-gghzv" (OuterVolumeSpecName: "kube-api-access-gghzv") pod "9e6cd880-e90d-45aa-a0df-1062b926d00d" (UID: "9e6cd880-e90d-45aa-a0df-1062b926d00d"). InnerVolumeSpecName "kube-api-access-gghzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.366376 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e6cd880-e90d-45aa-a0df-1062b926d00d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e6cd880-e90d-45aa-a0df-1062b926d00d" (UID: "9e6cd880-e90d-45aa-a0df-1062b926d00d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.382651 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e6cd880-e90d-45aa-a0df-1062b926d00d-config-data" (OuterVolumeSpecName: "config-data") pod "9e6cd880-e90d-45aa-a0df-1062b926d00d" (UID: "9e6cd880-e90d-45aa-a0df-1062b926d00d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.434503 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gghzv\" (UniqueName: \"kubernetes.io/projected/9e6cd880-e90d-45aa-a0df-1062b926d00d-kube-api-access-gghzv\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.434721 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6cd880-e90d-45aa-a0df-1062b926d00d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.434731 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e6cd880-e90d-45aa-a0df-1062b926d00d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.737556 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.738892 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9e6cd880-e90d-45aa-a0df-1062b926d00d","Type":"ContainerDied","Data":"61d1e3031e03ea8d9ad004cd1e414c5b5ff2e26e47e425bd33501e072dca8b7f"} Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.738940 4756 scope.go:117] "RemoveContainer" containerID="625555a70f8ea9b9ae576b12ef7f09296f198b89b65c6e0fea52df5df4fd2517" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.744448 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ae77056-644f-47f8-8c84-0dfa7dc9253c","Type":"ContainerStarted","Data":"137f237f2d10720025363ff80e2916fce11bc7ebdb45029d7c3353bc1220d752"} Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.744482 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9ae77056-644f-47f8-8c84-0dfa7dc9253c","Type":"ContainerStarted","Data":"024a096b572ec5f3116476117df2599c7ceca87d027f6aba68c8c7d61a2893d3"} Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.773164 4756 scope.go:117] "RemoveContainer" containerID="0fe85479583a1509af78f8d77bee8b1973bccad57fb0aae5c477a285b612f99f" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.776990 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.779949 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.792832 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.810102 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 14:24:41 crc kubenswrapper[4756]: E0318 14:24:41.810588 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e6cd880-e90d-45aa-a0df-1062b926d00d" containerName="nova-api-log" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.810605 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6cd880-e90d-45aa-a0df-1062b926d00d" containerName="nova-api-log" Mar 18 14:24:41 crc kubenswrapper[4756]: E0318 14:24:41.810614 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e6cd880-e90d-45aa-a0df-1062b926d00d" containerName="nova-api-api" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.810621 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6cd880-e90d-45aa-a0df-1062b926d00d" containerName="nova-api-api" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.810999 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e6cd880-e90d-45aa-a0df-1062b926d00d" containerName="nova-api-api" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.812690 4756 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9e6cd880-e90d-45aa-a0df-1062b926d00d" containerName="nova-api-log" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.813807 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.818390 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.821437 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.831317 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.832763 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.944910 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-config-data\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.945030 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.945104 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.945139 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb8jn\" (UniqueName: \"kubernetes.io/projected/246a5313-e9e8-4819-a58c-65dbc141ea7f-kube-api-access-tb8jn\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.945172 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/246a5313-e9e8-4819-a58c-65dbc141ea7f-logs\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.945219 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-public-tls-certs\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.970789 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-fttqt"] Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.972067 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.973917 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.974127 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 18 14:24:41 crc kubenswrapper[4756]: I0318 14:24:41.980301 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fttqt"] Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.030512 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.232:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.030570 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.232:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.046820 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-config-data\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.047081 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.047246 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.047315 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb8jn\" (UniqueName: \"kubernetes.io/projected/246a5313-e9e8-4819-a58c-65dbc141ea7f-kube-api-access-tb8jn\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.047386 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/246a5313-e9e8-4819-a58c-65dbc141ea7f-logs\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.047499 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-public-tls-certs\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.050083 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/246a5313-e9e8-4819-a58c-65dbc141ea7f-logs\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.060645 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.061033 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-public-tls-certs\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.061335 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-config-data\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.061531 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.068650 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb8jn\" (UniqueName: \"kubernetes.io/projected/246a5313-e9e8-4819-a58c-65dbc141ea7f-kube-api-access-tb8jn\") pod \"nova-api-0\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") " pod="openstack/nova-api-0" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.129323 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.149248 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-config-data\") pod \"nova-cell1-cell-mapping-fttqt\" (UID: \"2217a20d-e435-4926-a713-89fc852aab36\") " pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.149454 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-scripts\") pod \"nova-cell1-cell-mapping-fttqt\" (UID: \"2217a20d-e435-4926-a713-89fc852aab36\") " pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.149541 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gblv\" (UniqueName: \"kubernetes.io/projected/2217a20d-e435-4926-a713-89fc852aab36-kube-api-access-8gblv\") pod \"nova-cell1-cell-mapping-fttqt\" (UID: \"2217a20d-e435-4926-a713-89fc852aab36\") " pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.149595 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fttqt\" (UID: \"2217a20d-e435-4926-a713-89fc852aab36\") " pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.251741 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-scripts\") pod \"nova-cell1-cell-mapping-fttqt\" (UID: 
\"2217a20d-e435-4926-a713-89fc852aab36\") " pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.251824 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gblv\" (UniqueName: \"kubernetes.io/projected/2217a20d-e435-4926-a713-89fc852aab36-kube-api-access-8gblv\") pod \"nova-cell1-cell-mapping-fttqt\" (UID: \"2217a20d-e435-4926-a713-89fc852aab36\") " pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.251866 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fttqt\" (UID: \"2217a20d-e435-4926-a713-89fc852aab36\") " pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.251895 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-config-data\") pod \"nova-cell1-cell-mapping-fttqt\" (UID: \"2217a20d-e435-4926-a713-89fc852aab36\") " pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.262788 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-scripts\") pod \"nova-cell1-cell-mapping-fttqt\" (UID: \"2217a20d-e435-4926-a713-89fc852aab36\") " pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.272809 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-config-data\") pod \"nova-cell1-cell-mapping-fttqt\" (UID: \"2217a20d-e435-4926-a713-89fc852aab36\") " 
pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.278197 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fttqt\" (UID: \"2217a20d-e435-4926-a713-89fc852aab36\") " pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.286229 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gblv\" (UniqueName: \"kubernetes.io/projected/2217a20d-e435-4926-a713-89fc852aab36-kube-api-access-8gblv\") pod \"nova-cell1-cell-mapping-fttqt\" (UID: \"2217a20d-e435-4926-a713-89fc852aab36\") " pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.289579 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.706916 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.824339 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ae77056-644f-47f8-8c84-0dfa7dc9253c","Type":"ContainerStarted","Data":"1a48e84784bf58cbd63a2fc9cd75842af8bf8f5ed0cdab6e1f76e7d2f169ce91"} Mar 18 14:24:42 crc kubenswrapper[4756]: I0318 14:24:42.840143 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"246a5313-e9e8-4819-a58c-65dbc141ea7f","Type":"ContainerStarted","Data":"911c71a9a924ed2f4fc2de67d21150ab42d959575bd35c5d0bb136f46068b8cc"} Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.051183 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vkw76"] Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.053578 4756 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vkw76" Mar 18 14:24:43 crc kubenswrapper[4756]: W0318 14:24:43.069252 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2217a20d_e435_4926_a713_89fc852aab36.slice/crio-accb9ce6bf5098a9a0863826e833b9cdc39243534450b1f263affbf5c850d665 WatchSource:0}: Error finding container accb9ce6bf5098a9a0863826e833b9cdc39243534450b1f263affbf5c850d665: Status 404 returned error can't find the container with id accb9ce6bf5098a9a0863826e833b9cdc39243534450b1f263affbf5c850d665 Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.070848 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vkw76"] Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.089837 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fttqt"] Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.177344 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea224-eb93-4f10-95e8-a1310e31b70f-utilities\") pod \"redhat-operators-vkw76\" (UID: \"3b2ea224-eb93-4f10-95e8-a1310e31b70f\") " pod="openshift-marketplace/redhat-operators-vkw76" Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.177455 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfg4l\" (UniqueName: \"kubernetes.io/projected/3b2ea224-eb93-4f10-95e8-a1310e31b70f-kube-api-access-cfg4l\") pod \"redhat-operators-vkw76\" (UID: \"3b2ea224-eb93-4f10-95e8-a1310e31b70f\") " pod="openshift-marketplace/redhat-operators-vkw76" Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.177522 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea224-eb93-4f10-95e8-a1310e31b70f-catalog-content\") pod \"redhat-operators-vkw76\" (UID: \"3b2ea224-eb93-4f10-95e8-a1310e31b70f\") " pod="openshift-marketplace/redhat-operators-vkw76" Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.280449 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea224-eb93-4f10-95e8-a1310e31b70f-utilities\") pod \"redhat-operators-vkw76\" (UID: \"3b2ea224-eb93-4f10-95e8-a1310e31b70f\") " pod="openshift-marketplace/redhat-operators-vkw76" Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.280543 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfg4l\" (UniqueName: \"kubernetes.io/projected/3b2ea224-eb93-4f10-95e8-a1310e31b70f-kube-api-access-cfg4l\") pod \"redhat-operators-vkw76\" (UID: \"3b2ea224-eb93-4f10-95e8-a1310e31b70f\") " pod="openshift-marketplace/redhat-operators-vkw76" Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.280600 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea224-eb93-4f10-95e8-a1310e31b70f-catalog-content\") pod \"redhat-operators-vkw76\" (UID: \"3b2ea224-eb93-4f10-95e8-a1310e31b70f\") " pod="openshift-marketplace/redhat-operators-vkw76" Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.280901 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea224-eb93-4f10-95e8-a1310e31b70f-utilities\") pod \"redhat-operators-vkw76\" (UID: \"3b2ea224-eb93-4f10-95e8-a1310e31b70f\") " pod="openshift-marketplace/redhat-operators-vkw76" Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.281002 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3b2ea224-eb93-4f10-95e8-a1310e31b70f-catalog-content\") pod \"redhat-operators-vkw76\" (UID: \"3b2ea224-eb93-4f10-95e8-a1310e31b70f\") " pod="openshift-marketplace/redhat-operators-vkw76" Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.310030 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfg4l\" (UniqueName: \"kubernetes.io/projected/3b2ea224-eb93-4f10-95e8-a1310e31b70f-kube-api-access-cfg4l\") pod \"redhat-operators-vkw76\" (UID: \"3b2ea224-eb93-4f10-95e8-a1310e31b70f\") " pod="openshift-marketplace/redhat-operators-vkw76" Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.330814 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e6cd880-e90d-45aa-a0df-1062b926d00d" path="/var/lib/kubelet/pods/9e6cd880-e90d-45aa-a0df-1062b926d00d/volumes" Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.381827 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vkw76" Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.864823 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"246a5313-e9e8-4819-a58c-65dbc141ea7f","Type":"ContainerStarted","Data":"80c7c33720a7b2791aab6c4dadb2da291d712365968b3bf32d7b4ea985b2d65e"} Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.865299 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"246a5313-e9e8-4819-a58c-65dbc141ea7f","Type":"ContainerStarted","Data":"f13f5c30666cb5ccf26e0a13f5fa3a5486beda45288b824fed18830bdff20098"} Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.871716 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fttqt" event={"ID":"2217a20d-e435-4926-a713-89fc852aab36","Type":"ContainerStarted","Data":"c4f0ba2ad926a259b9adfdf94ebd5d19b559540676b8757e2e18ee3567425782"} Mar 18 14:24:43 crc kubenswrapper[4756]: 
I0318 14:24:43.871748 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fttqt" event={"ID":"2217a20d-e435-4926-a713-89fc852aab36","Type":"ContainerStarted","Data":"accb9ce6bf5098a9a0863826e833b9cdc39243534450b1f263affbf5c850d665"} Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.905679 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.905663201 podStartE2EDuration="2.905663201s" podCreationTimestamp="2026-03-18 14:24:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:24:43.891909619 +0000 UTC m=+1485.206327594" watchObservedRunningTime="2026-03-18 14:24:43.905663201 +0000 UTC m=+1485.220081176" Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.922301 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-fttqt" podStartSLOduration=2.922285511 podStartE2EDuration="2.922285511s" podCreationTimestamp="2026-03-18 14:24:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:24:43.91556961 +0000 UTC m=+1485.229987605" watchObservedRunningTime="2026-03-18 14:24:43.922285511 +0000 UTC m=+1485.236703486" Mar 18 14:24:43 crc kubenswrapper[4756]: I0318 14:24:43.972436 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vkw76"] Mar 18 14:24:44 crc kubenswrapper[4756]: I0318 14:24:44.140290 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:24:44 crc kubenswrapper[4756]: I0318 14:24:44.233190 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-ns4cg"] Mar 18 14:24:44 crc kubenswrapper[4756]: I0318 14:24:44.233682 4756 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cd565959-ns4cg" podUID="dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a" containerName="dnsmasq-dns" containerID="cri-o://c6348806d3190a6708fd6c47a6e91096b9409a436f6d9a0896df2b66fad90a12" gracePeriod=10 Mar 18 14:24:44 crc kubenswrapper[4756]: I0318 14:24:44.889872 4756 generic.go:334] "Generic (PLEG): container finished" podID="3b2ea224-eb93-4f10-95e8-a1310e31b70f" containerID="26d5fbb28b1ab434c8352ffffbb19953a7c2d2cdd7935a5375e8f9b7f75816ff" exitCode=0 Mar 18 14:24:44 crc kubenswrapper[4756]: I0318 14:24:44.889942 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkw76" event={"ID":"3b2ea224-eb93-4f10-95e8-a1310e31b70f","Type":"ContainerDied","Data":"26d5fbb28b1ab434c8352ffffbb19953a7c2d2cdd7935a5375e8f9b7f75816ff"} Mar 18 14:24:44 crc kubenswrapper[4756]: I0318 14:24:44.889965 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkw76" event={"ID":"3b2ea224-eb93-4f10-95e8-a1310e31b70f","Type":"ContainerStarted","Data":"eeeba384b65df231afe660dafae83a84deda2bd0f77b66c8b74a6a130ff29096"} Mar 18 14:24:44 crc kubenswrapper[4756]: I0318 14:24:44.897522 4756 generic.go:334] "Generic (PLEG): container finished" podID="dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a" containerID="c6348806d3190a6708fd6c47a6e91096b9409a436f6d9a0896df2b66fad90a12" exitCode=0 Mar 18 14:24:44 crc kubenswrapper[4756]: I0318 14:24:44.897837 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-ns4cg" event={"ID":"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a","Type":"ContainerDied","Data":"c6348806d3190a6708fd6c47a6e91096b9409a436f6d9a0896df2b66fad90a12"} Mar 18 14:24:44 crc kubenswrapper[4756]: I0318 14:24:44.904142 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9ae77056-644f-47f8-8c84-0dfa7dc9253c","Type":"ContainerStarted","Data":"9aefd73dcd58ea52819094f7d0cccb4d9e4f3c4106072421e84664a1728ccc8b"} Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.118002 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.240755 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvczs\" (UniqueName: \"kubernetes.io/projected/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-kube-api-access-rvczs\") pod \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.240939 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-dns-svc\") pod \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.240998 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-ovsdbserver-nb\") pod \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.241040 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-config\") pod \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.241074 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-dns-swift-storage-0\") pod \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.241106 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-ovsdbserver-sb\") pod \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\" (UID: \"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a\") " Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.275415 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-kube-api-access-rvczs" (OuterVolumeSpecName: "kube-api-access-rvczs") pod "dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a" (UID: "dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a"). InnerVolumeSpecName "kube-api-access-rvczs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.343314 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvczs\" (UniqueName: \"kubernetes.io/projected/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-kube-api-access-rvczs\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.437792 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a" (UID: "dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.438265 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a" (UID: "dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.438546 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a" (UID: "dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.445031 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.445053 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.445064 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.446481 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") 
pod "dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a" (UID: "dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.471357 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-config" (OuterVolumeSpecName: "config") pod "dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a" (UID: "dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.546587 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.546619 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.915007 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-ns4cg" event={"ID":"dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a","Type":"ContainerDied","Data":"271b33457d9b61f32faa1893f63b97cadcb8ab9fdf1eb59649145f1db533fdf9"} Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.915057 4756 scope.go:117] "RemoveContainer" containerID="c6348806d3190a6708fd6c47a6e91096b9409a436f6d9a0896df2b66fad90a12" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.915067 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-ns4cg" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.935079 4756 scope.go:117] "RemoveContainer" containerID="0566d276433c3b7f728bf701e3b74cae247edc93d6211bacbb0ec1ac3f66f28e" Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.960162 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-ns4cg"] Mar 18 14:24:45 crc kubenswrapper[4756]: I0318 14:24:45.978548 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-ns4cg"] Mar 18 14:24:46 crc kubenswrapper[4756]: I0318 14:24:46.934016 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkw76" event={"ID":"3b2ea224-eb93-4f10-95e8-a1310e31b70f","Type":"ContainerStarted","Data":"62f71fad038d5624ac73c768dcb7e916ae0deeadd96b9c071097a2c1e2b2aeb6"} Mar 18 14:24:47 crc kubenswrapper[4756]: I0318 14:24:47.329340 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a" path="/var/lib/kubelet/pods/dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a/volumes" Mar 18 14:24:47 crc kubenswrapper[4756]: I0318 14:24:47.948270 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ae77056-644f-47f8-8c84-0dfa7dc9253c","Type":"ContainerStarted","Data":"41f9fc0c42655ee1aca2ee8336bb8a176370e42def7a8709df39dac27aeef4c6"} Mar 18 14:24:47 crc kubenswrapper[4756]: I0318 14:24:47.949418 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="ceilometer-central-agent" containerID="cri-o://137f237f2d10720025363ff80e2916fce11bc7ebdb45029d7c3353bc1220d752" gracePeriod=30 Mar 18 14:24:47 crc kubenswrapper[4756]: I0318 14:24:47.949481 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="ceilometer-notification-agent" containerID="cri-o://1a48e84784bf58cbd63a2fc9cd75842af8bf8f5ed0cdab6e1f76e7d2f169ce91" gracePeriod=30 Mar 18 14:24:47 crc kubenswrapper[4756]: I0318 14:24:47.949527 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="proxy-httpd" containerID="cri-o://41f9fc0c42655ee1aca2ee8336bb8a176370e42def7a8709df39dac27aeef4c6" gracePeriod=30 Mar 18 14:24:47 crc kubenswrapper[4756]: I0318 14:24:47.949481 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="sg-core" containerID="cri-o://9aefd73dcd58ea52819094f7d0cccb4d9e4f3c4106072421e84664a1728ccc8b" gracePeriod=30 Mar 18 14:24:47 crc kubenswrapper[4756]: I0318 14:24:47.992380 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.1528859479999998 podStartE2EDuration="8.992358212s" podCreationTimestamp="2026-03-18 14:24:39 +0000 UTC" firstStartedPulling="2026-03-18 14:24:40.916043765 +0000 UTC m=+1482.230461740" lastFinishedPulling="2026-03-18 14:24:46.755516029 +0000 UTC m=+1488.069934004" observedRunningTime="2026-03-18 14:24:47.974640143 +0000 UTC m=+1489.289058118" watchObservedRunningTime="2026-03-18 14:24:47.992358212 +0000 UTC m=+1489.306776187" Mar 18 14:24:48 crc kubenswrapper[4756]: I0318 14:24:48.963435 4756 generic.go:334] "Generic (PLEG): container finished" podID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerID="41f9fc0c42655ee1aca2ee8336bb8a176370e42def7a8709df39dac27aeef4c6" exitCode=0 Mar 18 14:24:48 crc kubenswrapper[4756]: I0318 14:24:48.963672 4756 generic.go:334] "Generic (PLEG): container finished" podID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerID="9aefd73dcd58ea52819094f7d0cccb4d9e4f3c4106072421e84664a1728ccc8b" 
exitCode=2 Mar 18 14:24:48 crc kubenswrapper[4756]: I0318 14:24:48.963685 4756 generic.go:334] "Generic (PLEG): container finished" podID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerID="1a48e84784bf58cbd63a2fc9cd75842af8bf8f5ed0cdab6e1f76e7d2f169ce91" exitCode=0 Mar 18 14:24:48 crc kubenswrapper[4756]: I0318 14:24:48.963654 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ae77056-644f-47f8-8c84-0dfa7dc9253c","Type":"ContainerDied","Data":"41f9fc0c42655ee1aca2ee8336bb8a176370e42def7a8709df39dac27aeef4c6"} Mar 18 14:24:48 crc kubenswrapper[4756]: I0318 14:24:48.963721 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ae77056-644f-47f8-8c84-0dfa7dc9253c","Type":"ContainerDied","Data":"9aefd73dcd58ea52819094f7d0cccb4d9e4f3c4106072421e84664a1728ccc8b"} Mar 18 14:24:48 crc kubenswrapper[4756]: I0318 14:24:48.963734 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ae77056-644f-47f8-8c84-0dfa7dc9253c","Type":"ContainerDied","Data":"1a48e84784bf58cbd63a2fc9cd75842af8bf8f5ed0cdab6e1f76e7d2f169ce91"} Mar 18 14:24:49 crc kubenswrapper[4756]: I0318 14:24:49.018404 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 14:24:49 crc kubenswrapper[4756]: I0318 14:24:49.018465 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 14:24:49 crc kubenswrapper[4756]: I0318 14:24:49.993729 4756 generic.go:334] "Generic (PLEG): container finished" podID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerID="137f237f2d10720025363ff80e2916fce11bc7ebdb45029d7c3353bc1220d752" exitCode=0 Mar 18 14:24:49 crc kubenswrapper[4756]: I0318 14:24:49.993896 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9ae77056-644f-47f8-8c84-0dfa7dc9253c","Type":"ContainerDied","Data":"137f237f2d10720025363ff80e2916fce11bc7ebdb45029d7c3353bc1220d752"} Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.094674 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.243266 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-ceilometer-tls-certs\") pod \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.243612 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ae77056-644f-47f8-8c84-0dfa7dc9253c-run-httpd\") pod \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.243794 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-scripts\") pod \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.243883 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pc84\" (UniqueName: \"kubernetes.io/projected/9ae77056-644f-47f8-8c84-0dfa7dc9253c-kube-api-access-2pc84\") pod \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.243911 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-config-data\") pod 
\"9ae77056-644f-47f8-8c84-0dfa7dc9253c\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.243965 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-sg-core-conf-yaml\") pod \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.243991 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ae77056-644f-47f8-8c84-0dfa7dc9253c-log-httpd\") pod \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.244010 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-combined-ca-bundle\") pod \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\" (UID: \"9ae77056-644f-47f8-8c84-0dfa7dc9253c\") " Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.244315 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ae77056-644f-47f8-8c84-0dfa7dc9253c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9ae77056-644f-47f8-8c84-0dfa7dc9253c" (UID: "9ae77056-644f-47f8-8c84-0dfa7dc9253c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.245876 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ae77056-644f-47f8-8c84-0dfa7dc9253c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9ae77056-644f-47f8-8c84-0dfa7dc9253c" (UID: "9ae77056-644f-47f8-8c84-0dfa7dc9253c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.255741 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ae77056-644f-47f8-8c84-0dfa7dc9253c-kube-api-access-2pc84" (OuterVolumeSpecName: "kube-api-access-2pc84") pod "9ae77056-644f-47f8-8c84-0dfa7dc9253c" (UID: "9ae77056-644f-47f8-8c84-0dfa7dc9253c"). InnerVolumeSpecName "kube-api-access-2pc84". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.255869 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ae77056-644f-47f8-8c84-0dfa7dc9253c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.255910 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ae77056-644f-47f8-8c84-0dfa7dc9253c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.258250 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-scripts" (OuterVolumeSpecName: "scripts") pod "9ae77056-644f-47f8-8c84-0dfa7dc9253c" (UID: "9ae77056-644f-47f8-8c84-0dfa7dc9253c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.297066 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9ae77056-644f-47f8-8c84-0dfa7dc9253c" (UID: "9ae77056-644f-47f8-8c84-0dfa7dc9253c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.310755 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9ae77056-644f-47f8-8c84-0dfa7dc9253c" (UID: "9ae77056-644f-47f8-8c84-0dfa7dc9253c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.341561 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ae77056-644f-47f8-8c84-0dfa7dc9253c" (UID: "9ae77056-644f-47f8-8c84-0dfa7dc9253c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.357365 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pc84\" (UniqueName: \"kubernetes.io/projected/9ae77056-644f-47f8-8c84-0dfa7dc9253c-kube-api-access-2pc84\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.357398 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.357407 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.357415 4756 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.357423 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.371197 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-config-data" (OuterVolumeSpecName: "config-data") pod "9ae77056-644f-47f8-8c84-0dfa7dc9253c" (UID: "9ae77056-644f-47f8-8c84-0dfa7dc9253c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:50 crc kubenswrapper[4756]: I0318 14:24:50.460734 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae77056-644f-47f8-8c84-0dfa7dc9253c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.011304 4756 generic.go:334] "Generic (PLEG): container finished" podID="2217a20d-e435-4926-a713-89fc852aab36" containerID="c4f0ba2ad926a259b9adfdf94ebd5d19b559540676b8757e2e18ee3567425782" exitCode=0 Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.011395 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fttqt" event={"ID":"2217a20d-e435-4926-a713-89fc852aab36","Type":"ContainerDied","Data":"c4f0ba2ad926a259b9adfdf94ebd5d19b559540676b8757e2e18ee3567425782"} Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.016663 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ae77056-644f-47f8-8c84-0dfa7dc9253c","Type":"ContainerDied","Data":"024a096b572ec5f3116476117df2599c7ceca87d027f6aba68c8c7d61a2893d3"} Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.016708 4756 
scope.go:117] "RemoveContainer" containerID="41f9fc0c42655ee1aca2ee8336bb8a176370e42def7a8709df39dac27aeef4c6" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.016715 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.030463 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.032824 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.044358 4756 scope.go:117] "RemoveContainer" containerID="9aefd73dcd58ea52819094f7d0cccb4d9e4f3c4106072421e84664a1728ccc8b" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.051223 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.081405 4756 scope.go:117] "RemoveContainer" containerID="1a48e84784bf58cbd63a2fc9cd75842af8bf8f5ed0cdab6e1f76e7d2f169ce91" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.108298 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.117767 4756 scope.go:117] "RemoveContainer" containerID="137f237f2d10720025363ff80e2916fce11bc7ebdb45029d7c3353bc1220d752" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.128512 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.143164 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:51 crc kubenswrapper[4756]: E0318 14:24:51.143551 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a" containerName="dnsmasq-dns" Mar 18 14:24:51 crc 
kubenswrapper[4756]: I0318 14:24:51.143567 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a" containerName="dnsmasq-dns" Mar 18 14:24:51 crc kubenswrapper[4756]: E0318 14:24:51.143581 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="proxy-httpd" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.143588 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="proxy-httpd" Mar 18 14:24:51 crc kubenswrapper[4756]: E0318 14:24:51.143596 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a" containerName="init" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.143602 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a" containerName="init" Mar 18 14:24:51 crc kubenswrapper[4756]: E0318 14:24:51.143628 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="ceilometer-notification-agent" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.143634 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="ceilometer-notification-agent" Mar 18 14:24:51 crc kubenswrapper[4756]: E0318 14:24:51.143652 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="sg-core" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.143659 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="sg-core" Mar 18 14:24:51 crc kubenswrapper[4756]: E0318 14:24:51.143671 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="ceilometer-central-agent" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 
14:24:51.143677 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="ceilometer-central-agent" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.143847 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbe49ac8-2f68-44cb-8f4b-e5c11ffd021a" containerName="dnsmasq-dns" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.144951 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="ceilometer-central-agent" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.144986 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="proxy-httpd" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.144998 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="ceilometer-notification-agent" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.145005 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" containerName="sg-core" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.147264 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.150689 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.155025 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.155054 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.155268 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.276964 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c898ac99-8bd7-467f-8540-0b23e255b8a0-log-httpd\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.277017 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-scripts\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.277042 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.277113 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.277150 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.277216 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c898ac99-8bd7-467f-8540-0b23e255b8a0-run-httpd\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.277243 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-config-data\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.277269 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7k4m\" (UniqueName: \"kubernetes.io/projected/c898ac99-8bd7-467f-8540-0b23e255b8a0-kube-api-access-t7k4m\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.325620 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ae77056-644f-47f8-8c84-0dfa7dc9253c" path="/var/lib/kubelet/pods/9ae77056-644f-47f8-8c84-0dfa7dc9253c/volumes" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 
14:24:51.379203 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.379246 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.379321 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c898ac99-8bd7-467f-8540-0b23e255b8a0-run-httpd\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.379346 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-config-data\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.379365 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7k4m\" (UniqueName: \"kubernetes.io/projected/c898ac99-8bd7-467f-8540-0b23e255b8a0-kube-api-access-t7k4m\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.379423 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c898ac99-8bd7-467f-8540-0b23e255b8a0-log-httpd\") pod 
\"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.379443 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-scripts\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.379463 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.380667 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c898ac99-8bd7-467f-8540-0b23e255b8a0-log-httpd\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.382964 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c898ac99-8bd7-467f-8540-0b23e255b8a0-run-httpd\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.386809 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-scripts\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.387623 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.388357 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-config-data\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.388389 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.388475 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.402497 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7k4m\" (UniqueName: \"kubernetes.io/projected/c898ac99-8bd7-467f-8540-0b23e255b8a0-kube-api-access-t7k4m\") pod \"ceilometer-0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") " pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.467647 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 14:24:51 crc kubenswrapper[4756]: I0318 14:24:51.968467 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:24:52 crc kubenswrapper[4756]: I0318 14:24:52.026658 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c898ac99-8bd7-467f-8540-0b23e255b8a0","Type":"ContainerStarted","Data":"aeb7174ec7d58ed04c25ee870da85d041b9dd65b9699d1673b5ef50da5b13817"} Mar 18 14:24:52 crc kubenswrapper[4756]: I0318 14:24:52.039634 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 14:24:52 crc kubenswrapper[4756]: I0318 14:24:52.130460 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 14:24:52 crc kubenswrapper[4756]: I0318 14:24:52.130505 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 14:24:52 crc kubenswrapper[4756]: I0318 14:24:52.812373 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:52 crc kubenswrapper[4756]: I0318 14:24:52.910497 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-config-data\") pod \"2217a20d-e435-4926-a713-89fc852aab36\" (UID: \"2217a20d-e435-4926-a713-89fc852aab36\") " Mar 18 14:24:52 crc kubenswrapper[4756]: I0318 14:24:52.910554 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-combined-ca-bundle\") pod \"2217a20d-e435-4926-a713-89fc852aab36\" (UID: \"2217a20d-e435-4926-a713-89fc852aab36\") " Mar 18 14:24:52 crc kubenswrapper[4756]: I0318 14:24:52.910617 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-scripts\") pod \"2217a20d-e435-4926-a713-89fc852aab36\" (UID: \"2217a20d-e435-4926-a713-89fc852aab36\") " Mar 18 14:24:52 crc kubenswrapper[4756]: I0318 14:24:52.910773 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gblv\" (UniqueName: \"kubernetes.io/projected/2217a20d-e435-4926-a713-89fc852aab36-kube-api-access-8gblv\") pod \"2217a20d-e435-4926-a713-89fc852aab36\" (UID: \"2217a20d-e435-4926-a713-89fc852aab36\") " Mar 18 14:24:52 crc kubenswrapper[4756]: I0318 14:24:52.923357 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2217a20d-e435-4926-a713-89fc852aab36-kube-api-access-8gblv" (OuterVolumeSpecName: "kube-api-access-8gblv") pod "2217a20d-e435-4926-a713-89fc852aab36" (UID: "2217a20d-e435-4926-a713-89fc852aab36"). InnerVolumeSpecName "kube-api-access-8gblv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:24:52 crc kubenswrapper[4756]: I0318 14:24:52.932544 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-scripts" (OuterVolumeSpecName: "scripts") pod "2217a20d-e435-4926-a713-89fc852aab36" (UID: "2217a20d-e435-4926-a713-89fc852aab36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:52 crc kubenswrapper[4756]: I0318 14:24:52.948395 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-config-data" (OuterVolumeSpecName: "config-data") pod "2217a20d-e435-4926-a713-89fc852aab36" (UID: "2217a20d-e435-4926-a713-89fc852aab36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:52 crc kubenswrapper[4756]: I0318 14:24:52.953266 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2217a20d-e435-4926-a713-89fc852aab36" (UID: "2217a20d-e435-4926-a713-89fc852aab36"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.012853 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gblv\" (UniqueName: \"kubernetes.io/projected/2217a20d-e435-4926-a713-89fc852aab36-kube-api-access-8gblv\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.012892 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.012904 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.012913 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2217a20d-e435-4926-a713-89fc852aab36-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.038323 4756 generic.go:334] "Generic (PLEG): container finished" podID="3b2ea224-eb93-4f10-95e8-a1310e31b70f" containerID="62f71fad038d5624ac73c768dcb7e916ae0deeadd96b9c071097a2c1e2b2aeb6" exitCode=0 Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.038386 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkw76" event={"ID":"3b2ea224-eb93-4f10-95e8-a1310e31b70f","Type":"ContainerDied","Data":"62f71fad038d5624ac73c768dcb7e916ae0deeadd96b9c071097a2c1e2b2aeb6"} Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.040898 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c898ac99-8bd7-467f-8540-0b23e255b8a0","Type":"ContainerStarted","Data":"6b05b0858d1ecf12252983a5a833bf1686ff3f3752e16451acf7dcd5d7fc6549"} Mar 18 
14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.042945 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fttqt" event={"ID":"2217a20d-e435-4926-a713-89fc852aab36","Type":"ContainerDied","Data":"accb9ce6bf5098a9a0863826e833b9cdc39243534450b1f263affbf5c850d665"} Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.042970 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="accb9ce6bf5098a9a0863826e833b9cdc39243534450b1f263affbf5c850d665" Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.042991 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fttqt" Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.150410 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="246a5313-e9e8-4819-a58c-65dbc141ea7f" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.235:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.150409 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="246a5313-e9e8-4819-a58c-65dbc141ea7f" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.235:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.217682 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.217967 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="246a5313-e9e8-4819-a58c-65dbc141ea7f" containerName="nova-api-log" containerID="cri-o://f13f5c30666cb5ccf26e0a13f5fa3a5486beda45288b824fed18830bdff20098" gracePeriod=30 Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.218199 4756 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="246a5313-e9e8-4819-a58c-65dbc141ea7f" containerName="nova-api-api" containerID="cri-o://80c7c33720a7b2791aab6c4dadb2da291d712365968b3bf32d7b4ea985b2d65e" gracePeriod=30 Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.265014 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.265383 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="09ffabff-92b2-4f54-b65e-eb3a0104f61e" containerName="nova-scheduler-scheduler" containerID="cri-o://9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1" gracePeriod=30 Mar 18 14:24:53 crc kubenswrapper[4756]: I0318 14:24:53.311217 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 14:24:54 crc kubenswrapper[4756]: I0318 14:24:54.089217 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkw76" event={"ID":"3b2ea224-eb93-4f10-95e8-a1310e31b70f","Type":"ContainerStarted","Data":"69cc379f440e720c0d23e4cfbd5e0926ba1e0b40fe3119d9e60ba0fbb2089b91"} Mar 18 14:24:54 crc kubenswrapper[4756]: I0318 14:24:54.119576 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c898ac99-8bd7-467f-8540-0b23e255b8a0","Type":"ContainerStarted","Data":"2218946ffdb97078a3ec5a7ae5479a8301bf6c8cc7d335bcab668670aef9761b"} Mar 18 14:24:54 crc kubenswrapper[4756]: I0318 14:24:54.134971 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vkw76" podStartSLOduration=2.520075434 podStartE2EDuration="11.134955892s" podCreationTimestamp="2026-03-18 14:24:43 +0000 UTC" firstStartedPulling="2026-03-18 14:24:44.891534568 +0000 UTC m=+1486.205952543" lastFinishedPulling="2026-03-18 14:24:53.506415026 
+0000 UTC m=+1494.820833001" observedRunningTime="2026-03-18 14:24:54.124601002 +0000 UTC m=+1495.439018977" watchObservedRunningTime="2026-03-18 14:24:54.134955892 +0000 UTC m=+1495.449373867" Mar 18 14:24:54 crc kubenswrapper[4756]: I0318 14:24:54.140837 4756 generic.go:334] "Generic (PLEG): container finished" podID="246a5313-e9e8-4819-a58c-65dbc141ea7f" containerID="f13f5c30666cb5ccf26e0a13f5fa3a5486beda45288b824fed18830bdff20098" exitCode=143 Mar 18 14:24:54 crc kubenswrapper[4756]: I0318 14:24:54.141171 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"246a5313-e9e8-4819-a58c-65dbc141ea7f","Type":"ContainerDied","Data":"f13f5c30666cb5ccf26e0a13f5fa3a5486beda45288b824fed18830bdff20098"} Mar 18 14:24:55 crc kubenswrapper[4756]: I0318 14:24:55.150738 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c898ac99-8bd7-467f-8540-0b23e255b8a0","Type":"ContainerStarted","Data":"777aae07aea8c03f4dd308159c968f7e1602eba563f6ff0b8826c4fa32eef7a6"} Mar 18 14:24:55 crc kubenswrapper[4756]: I0318 14:24:55.150891 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" containerName="nova-metadata-log" containerID="cri-o://9ea28543579c8a4a82ebf6cc16c9eaec55097eea58578b417baf645ca889972a" gracePeriod=30 Mar 18 14:24:55 crc kubenswrapper[4756]: I0318 14:24:55.150962 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" containerName="nova-metadata-metadata" containerID="cri-o://1cd38dc7d27a2683d688d561cae96bd617d30db38ab59c51082dfbe8585f365d" gracePeriod=30 Mar 18 14:24:56 crc kubenswrapper[4756]: I0318 14:24:56.160672 4756 generic.go:334] "Generic (PLEG): container finished" podID="5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" containerID="9ea28543579c8a4a82ebf6cc16c9eaec55097eea58578b417baf645ca889972a" 
exitCode=143 Mar 18 14:24:56 crc kubenswrapper[4756]: I0318 14:24:56.160747 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3","Type":"ContainerDied","Data":"9ea28543579c8a4a82ebf6cc16c9eaec55097eea58578b417baf645ca889972a"} Mar 18 14:24:56 crc kubenswrapper[4756]: E0318 14:24:56.997860 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1 is running failed: container process not found" containerID="9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 14:24:56 crc kubenswrapper[4756]: E0318 14:24:56.998823 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1 is running failed: container process not found" containerID="9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 14:24:56 crc kubenswrapper[4756]: E0318 14:24:56.999099 4756 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1 is running failed: container process not found" containerID="9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 14:24:56 crc kubenswrapper[4756]: E0318 14:24:56.999235 4756 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1 is running failed: container process not 
found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="09ffabff-92b2-4f54-b65e-eb3a0104f61e" containerName="nova-scheduler-scheduler" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.014510 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.107603 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwwqt\" (UniqueName: \"kubernetes.io/projected/09ffabff-92b2-4f54-b65e-eb3a0104f61e-kube-api-access-pwwqt\") pod \"09ffabff-92b2-4f54-b65e-eb3a0104f61e\" (UID: \"09ffabff-92b2-4f54-b65e-eb3a0104f61e\") " Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.107747 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ffabff-92b2-4f54-b65e-eb3a0104f61e-config-data\") pod \"09ffabff-92b2-4f54-b65e-eb3a0104f61e\" (UID: \"09ffabff-92b2-4f54-b65e-eb3a0104f61e\") " Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.107764 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ffabff-92b2-4f54-b65e-eb3a0104f61e-combined-ca-bundle\") pod \"09ffabff-92b2-4f54-b65e-eb3a0104f61e\" (UID: \"09ffabff-92b2-4f54-b65e-eb3a0104f61e\") " Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.121356 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ffabff-92b2-4f54-b65e-eb3a0104f61e-kube-api-access-pwwqt" (OuterVolumeSpecName: "kube-api-access-pwwqt") pod "09ffabff-92b2-4f54-b65e-eb3a0104f61e" (UID: "09ffabff-92b2-4f54-b65e-eb3a0104f61e"). InnerVolumeSpecName "kube-api-access-pwwqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.139183 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ffabff-92b2-4f54-b65e-eb3a0104f61e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09ffabff-92b2-4f54-b65e-eb3a0104f61e" (UID: "09ffabff-92b2-4f54-b65e-eb3a0104f61e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.140254 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ffabff-92b2-4f54-b65e-eb3a0104f61e-config-data" (OuterVolumeSpecName: "config-data") pod "09ffabff-92b2-4f54-b65e-eb3a0104f61e" (UID: "09ffabff-92b2-4f54-b65e-eb3a0104f61e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.178563 4756 generic.go:334] "Generic (PLEG): container finished" podID="09ffabff-92b2-4f54-b65e-eb3a0104f61e" containerID="9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1" exitCode=0 Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.178720 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"09ffabff-92b2-4f54-b65e-eb3a0104f61e","Type":"ContainerDied","Data":"9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1"} Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.179763 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"09ffabff-92b2-4f54-b65e-eb3a0104f61e","Type":"ContainerDied","Data":"50fb464daf8c021758506350ee04dbb9b32bd016ab44359083d201bf1d2a9913"} Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.179841 4756 scope.go:117] "RemoveContainer" containerID="9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1" Mar 18 14:24:57 crc 
kubenswrapper[4756]: I0318 14:24:57.178811 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.200574 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c898ac99-8bd7-467f-8540-0b23e255b8a0","Type":"ContainerStarted","Data":"e4038ab35e9263481a511e49a91f67c2b20be05770e28c7a766b3f118332292a"} Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.201321 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.209732 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwwqt\" (UniqueName: \"kubernetes.io/projected/09ffabff-92b2-4f54-b65e-eb3a0104f61e-kube-api-access-pwwqt\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.209765 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ffabff-92b2-4f54-b65e-eb3a0104f61e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.209775 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ffabff-92b2-4f54-b65e-eb3a0104f61e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.211206 4756 scope.go:117] "RemoveContainer" containerID="9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1" Mar 18 14:24:57 crc kubenswrapper[4756]: E0318 14:24:57.211612 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1\": container with ID starting with 9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1 not found: ID does not exist" 
containerID="9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.211659 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1"} err="failed to get container status \"9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1\": rpc error: code = NotFound desc = could not find container \"9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1\": container with ID starting with 9057c1e98f1328b903dca58dcac7e5bbb1c629ea340058b6bc6611ddc8836db1 not found: ID does not exist" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.229416 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.289594 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.301305 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 14:24:57 crc kubenswrapper[4756]: E0318 14:24:57.301996 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ffabff-92b2-4f54-b65e-eb3a0104f61e" containerName="nova-scheduler-scheduler" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.302074 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ffabff-92b2-4f54-b65e-eb3a0104f61e" containerName="nova-scheduler-scheduler" Mar 18 14:24:57 crc kubenswrapper[4756]: E0318 14:24:57.302151 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2217a20d-e435-4926-a713-89fc852aab36" containerName="nova-manage" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.302229 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2217a20d-e435-4926-a713-89fc852aab36" containerName="nova-manage" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.302480 4756 
memory_manager.go:354] "RemoveStaleState removing state" podUID="09ffabff-92b2-4f54-b65e-eb3a0104f61e" containerName="nova-scheduler-scheduler" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.302552 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2217a20d-e435-4926-a713-89fc852aab36" containerName="nova-manage" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.304044 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.306347 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.310875 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.910530744 podStartE2EDuration="6.310854925s" podCreationTimestamp="2026-03-18 14:24:51 +0000 UTC" firstStartedPulling="2026-03-18 14:24:51.994770753 +0000 UTC m=+1493.309188738" lastFinishedPulling="2026-03-18 14:24:56.395094944 +0000 UTC m=+1497.709512919" observedRunningTime="2026-03-18 14:24:57.238131648 +0000 UTC m=+1498.552549623" watchObservedRunningTime="2026-03-18 14:24:57.310854925 +0000 UTC m=+1498.625272900" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.329869 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ffabff-92b2-4f54-b65e-eb3a0104f61e" path="/var/lib/kubelet/pods/09ffabff-92b2-4f54-b65e-eb3a0104f61e/volumes" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.331155 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.414493 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmp4x\" (UniqueName: \"kubernetes.io/projected/d87ac058-1e5e-4aa6-801d-a7a92e65d112-kube-api-access-xmp4x\") pod 
\"nova-scheduler-0\" (UID: \"d87ac058-1e5e-4aa6-801d-a7a92e65d112\") " pod="openstack/nova-scheduler-0" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.414561 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d87ac058-1e5e-4aa6-801d-a7a92e65d112-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d87ac058-1e5e-4aa6-801d-a7a92e65d112\") " pod="openstack/nova-scheduler-0" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.414706 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d87ac058-1e5e-4aa6-801d-a7a92e65d112-config-data\") pod \"nova-scheduler-0\" (UID: \"d87ac058-1e5e-4aa6-801d-a7a92e65d112\") " pod="openstack/nova-scheduler-0" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.516833 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d87ac058-1e5e-4aa6-801d-a7a92e65d112-config-data\") pod \"nova-scheduler-0\" (UID: \"d87ac058-1e5e-4aa6-801d-a7a92e65d112\") " pod="openstack/nova-scheduler-0" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.517270 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmp4x\" (UniqueName: \"kubernetes.io/projected/d87ac058-1e5e-4aa6-801d-a7a92e65d112-kube-api-access-xmp4x\") pod \"nova-scheduler-0\" (UID: \"d87ac058-1e5e-4aa6-801d-a7a92e65d112\") " pod="openstack/nova-scheduler-0" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.517301 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d87ac058-1e5e-4aa6-801d-a7a92e65d112-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d87ac058-1e5e-4aa6-801d-a7a92e65d112\") " pod="openstack/nova-scheduler-0" Mar 18 14:24:57 crc 
kubenswrapper[4756]: I0318 14:24:57.525735 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d87ac058-1e5e-4aa6-801d-a7a92e65d112-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d87ac058-1e5e-4aa6-801d-a7a92e65d112\") " pod="openstack/nova-scheduler-0" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.528758 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d87ac058-1e5e-4aa6-801d-a7a92e65d112-config-data\") pod \"nova-scheduler-0\" (UID: \"d87ac058-1e5e-4aa6-801d-a7a92e65d112\") " pod="openstack/nova-scheduler-0" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.533595 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmp4x\" (UniqueName: \"kubernetes.io/projected/d87ac058-1e5e-4aa6-801d-a7a92e65d112-kube-api-access-xmp4x\") pod \"nova-scheduler-0\" (UID: \"d87ac058-1e5e-4aa6-801d-a7a92e65d112\") " pod="openstack/nova-scheduler-0" Mar 18 14:24:57 crc kubenswrapper[4756]: I0318 14:24:57.619610 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 14:24:58 crc kubenswrapper[4756]: I0318 14:24:58.104556 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 14:24:58 crc kubenswrapper[4756]: I0318 14:24:58.211442 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d87ac058-1e5e-4aa6-801d-a7a92e65d112","Type":"ContainerStarted","Data":"a196e306018a9f8970ea58dcf48b97513149696d36af0e29353acf0950f13fe6"} Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.224447 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d87ac058-1e5e-4aa6-801d-a7a92e65d112","Type":"ContainerStarted","Data":"8d885e1d0f67f016563af46980ac2400d92a938c429daf22cde6e9a95d18309a"} Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.226226 4756 generic.go:334] "Generic (PLEG): container finished" podID="5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" containerID="1cd38dc7d27a2683d688d561cae96bd617d30db38ab59c51082dfbe8585f365d" exitCode=0 Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.226280 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3","Type":"ContainerDied","Data":"1cd38dc7d27a2683d688d561cae96bd617d30db38ab59c51082dfbe8585f365d"} Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.226311 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3","Type":"ContainerDied","Data":"d271eccbda9dbae407d585e2a4b88c1fdb9b02fc598e2ac27a80e23440213ca5"} Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.226322 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d271eccbda9dbae407d585e2a4b88c1fdb9b02fc598e2ac27a80e23440213ca5" Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.241294 4756 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.241278112 podStartE2EDuration="2.241278112s" podCreationTimestamp="2026-03-18 14:24:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:24:59.239080393 +0000 UTC m=+1500.553498368" watchObservedRunningTime="2026-03-18 14:24:59.241278112 +0000 UTC m=+1500.555696077"
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.291158 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.355184 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-logs\") pod \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") "
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.355287 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-combined-ca-bundle\") pod \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") "
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.355326 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-nova-metadata-tls-certs\") pod \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") "
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.355390 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-config-data\") pod \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") "
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.355547 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g58qh\" (UniqueName: \"kubernetes.io/projected/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-kube-api-access-g58qh\") pod \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\" (UID: \"5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3\") "
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.356664 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-logs" (OuterVolumeSpecName: "logs") pod "5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" (UID: "5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.389620 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-kube-api-access-g58qh" (OuterVolumeSpecName: "kube-api-access-g58qh") pod "5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" (UID: "5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3"). InnerVolumeSpecName "kube-api-access-g58qh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.465784 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g58qh\" (UniqueName: \"kubernetes.io/projected/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-kube-api-access-g58qh\") on node \"crc\" DevicePath \"\""
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.465814 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-logs\") on node \"crc\" DevicePath \"\""
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.472098 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-config-data" (OuterVolumeSpecName: "config-data") pod "5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" (UID: "5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.495294 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" (UID: "5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.497882 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" (UID: "5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.567650 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.567992 4756 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 14:24:59 crc kubenswrapper[4756]: I0318 14:24:59.568005 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.130682 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.130740 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.272500 4756 generic.go:334] "Generic (PLEG): container finished" podID="246a5313-e9e8-4819-a58c-65dbc141ea7f" containerID="80c7c33720a7b2791aab6c4dadb2da291d712365968b3bf32d7b4ea985b2d65e" exitCode=0
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.272599 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.279882 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"246a5313-e9e8-4819-a58c-65dbc141ea7f","Type":"ContainerDied","Data":"80c7c33720a7b2791aab6c4dadb2da291d712365968b3bf32d7b4ea985b2d65e"}
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.388381 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.396709 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.419671 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.435716 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 14:25:00 crc kubenswrapper[4756]: E0318 14:25:00.436174 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" containerName="nova-metadata-log"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.436187 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" containerName="nova-metadata-log"
Mar 18 14:25:00 crc kubenswrapper[4756]: E0318 14:25:00.436216 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="246a5313-e9e8-4819-a58c-65dbc141ea7f" containerName="nova-api-api"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.436222 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="246a5313-e9e8-4819-a58c-65dbc141ea7f" containerName="nova-api-api"
Mar 18 14:25:00 crc kubenswrapper[4756]: E0318 14:25:00.436236 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" containerName="nova-metadata-metadata"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.436242 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" containerName="nova-metadata-metadata"
Mar 18 14:25:00 crc kubenswrapper[4756]: E0318 14:25:00.436255 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="246a5313-e9e8-4819-a58c-65dbc141ea7f" containerName="nova-api-log"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.436261 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="246a5313-e9e8-4819-a58c-65dbc141ea7f" containerName="nova-api-log"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.436481 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" containerName="nova-metadata-metadata"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.436499 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="246a5313-e9e8-4819-a58c-65dbc141ea7f" containerName="nova-api-api"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.436512 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" containerName="nova-metadata-log"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.436530 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="246a5313-e9e8-4819-a58c-65dbc141ea7f" containerName="nova-api-log"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.437784 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.440801 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.441296 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.446428 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.484671 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-public-tls-certs\") pod \"246a5313-e9e8-4819-a58c-65dbc141ea7f\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") "
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.484782 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-config-data\") pod \"246a5313-e9e8-4819-a58c-65dbc141ea7f\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") "
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.484874 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-internal-tls-certs\") pod \"246a5313-e9e8-4819-a58c-65dbc141ea7f\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") "
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.484893 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-combined-ca-bundle\") pod \"246a5313-e9e8-4819-a58c-65dbc141ea7f\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") "
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.484980 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb8jn\" (UniqueName: \"kubernetes.io/projected/246a5313-e9e8-4819-a58c-65dbc141ea7f-kube-api-access-tb8jn\") pod \"246a5313-e9e8-4819-a58c-65dbc141ea7f\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") "
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.485051 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/246a5313-e9e8-4819-a58c-65dbc141ea7f-logs\") pod \"246a5313-e9e8-4819-a58c-65dbc141ea7f\" (UID: \"246a5313-e9e8-4819-a58c-65dbc141ea7f\") "
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.485617 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/246a5313-e9e8-4819-a58c-65dbc141ea7f-logs" (OuterVolumeSpecName: "logs") pod "246a5313-e9e8-4819-a58c-65dbc141ea7f" (UID: "246a5313-e9e8-4819-a58c-65dbc141ea7f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.488543 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmnsw\" (UniqueName: \"kubernetes.io/projected/2895ca94-33ed-4aa0-bd42-af3b10592ae4-kube-api-access-gmnsw\") pod \"nova-metadata-0\" (UID: \"2895ca94-33ed-4aa0-bd42-af3b10592ae4\") " pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.488769 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2895ca94-33ed-4aa0-bd42-af3b10592ae4-logs\") pod \"nova-metadata-0\" (UID: \"2895ca94-33ed-4aa0-bd42-af3b10592ae4\") " pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.488900 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2895ca94-33ed-4aa0-bd42-af3b10592ae4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2895ca94-33ed-4aa0-bd42-af3b10592ae4\") " pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.488989 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2895ca94-33ed-4aa0-bd42-af3b10592ae4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2895ca94-33ed-4aa0-bd42-af3b10592ae4\") " pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.489131 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2895ca94-33ed-4aa0-bd42-af3b10592ae4-config-data\") pod \"nova-metadata-0\" (UID: \"2895ca94-33ed-4aa0-bd42-af3b10592ae4\") " pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.489281 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/246a5313-e9e8-4819-a58c-65dbc141ea7f-logs\") on node \"crc\" DevicePath \"\""
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.490160 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/246a5313-e9e8-4819-a58c-65dbc141ea7f-kube-api-access-tb8jn" (OuterVolumeSpecName: "kube-api-access-tb8jn") pod "246a5313-e9e8-4819-a58c-65dbc141ea7f" (UID: "246a5313-e9e8-4819-a58c-65dbc141ea7f"). InnerVolumeSpecName "kube-api-access-tb8jn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.532291 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "246a5313-e9e8-4819-a58c-65dbc141ea7f" (UID: "246a5313-e9e8-4819-a58c-65dbc141ea7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.541245 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-config-data" (OuterVolumeSpecName: "config-data") pod "246a5313-e9e8-4819-a58c-65dbc141ea7f" (UID: "246a5313-e9e8-4819-a58c-65dbc141ea7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:25:00 crc kubenswrapper[4756]: E0318 14:25:00.546146 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5873292f_2fc4_4a9d_b7ad_8fac22f3f9d3.slice\": RecentStats: unable to find data in memory cache]"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.577643 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "246a5313-e9e8-4819-a58c-65dbc141ea7f" (UID: "246a5313-e9e8-4819-a58c-65dbc141ea7f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.590356 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "246a5313-e9e8-4819-a58c-65dbc141ea7f" (UID: "246a5313-e9e8-4819-a58c-65dbc141ea7f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.591382 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmnsw\" (UniqueName: \"kubernetes.io/projected/2895ca94-33ed-4aa0-bd42-af3b10592ae4-kube-api-access-gmnsw\") pod \"nova-metadata-0\" (UID: \"2895ca94-33ed-4aa0-bd42-af3b10592ae4\") " pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.591457 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2895ca94-33ed-4aa0-bd42-af3b10592ae4-logs\") pod \"nova-metadata-0\" (UID: \"2895ca94-33ed-4aa0-bd42-af3b10592ae4\") " pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.591505 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2895ca94-33ed-4aa0-bd42-af3b10592ae4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2895ca94-33ed-4aa0-bd42-af3b10592ae4\") " pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.591539 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2895ca94-33ed-4aa0-bd42-af3b10592ae4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2895ca94-33ed-4aa0-bd42-af3b10592ae4\") " pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.591600 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2895ca94-33ed-4aa0-bd42-af3b10592ae4-config-data\") pod \"nova-metadata-0\" (UID: \"2895ca94-33ed-4aa0-bd42-af3b10592ae4\") " pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.591662 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb8jn\" (UniqueName: \"kubernetes.io/projected/246a5313-e9e8-4819-a58c-65dbc141ea7f-kube-api-access-tb8jn\") on node \"crc\" DevicePath \"\""
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.591678 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.591687 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.591696 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.591716 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246a5313-e9e8-4819-a58c-65dbc141ea7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.592927 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2895ca94-33ed-4aa0-bd42-af3b10592ae4-logs\") pod \"nova-metadata-0\" (UID: \"2895ca94-33ed-4aa0-bd42-af3b10592ae4\") " pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.596699 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2895ca94-33ed-4aa0-bd42-af3b10592ae4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2895ca94-33ed-4aa0-bd42-af3b10592ae4\") " pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.596717 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2895ca94-33ed-4aa0-bd42-af3b10592ae4-config-data\") pod \"nova-metadata-0\" (UID: \"2895ca94-33ed-4aa0-bd42-af3b10592ae4\") " pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.607730 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2895ca94-33ed-4aa0-bd42-af3b10592ae4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2895ca94-33ed-4aa0-bd42-af3b10592ae4\") " pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.608175 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmnsw\" (UniqueName: \"kubernetes.io/projected/2895ca94-33ed-4aa0-bd42-af3b10592ae4-kube-api-access-gmnsw\") pod \"nova-metadata-0\" (UID: \"2895ca94-33ed-4aa0-bd42-af3b10592ae4\") " pod="openstack/nova-metadata-0"
Mar 18 14:25:00 crc kubenswrapper[4756]: I0318 14:25:00.752822 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.196614 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.281420 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2895ca94-33ed-4aa0-bd42-af3b10592ae4","Type":"ContainerStarted","Data":"cd23bb0cb17d8189e8b653648fcb64b41bfeddfaf52a1f41b5c5aa6b67b3d762"}
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.283410 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"246a5313-e9e8-4819-a58c-65dbc141ea7f","Type":"ContainerDied","Data":"911c71a9a924ed2f4fc2de67d21150ab42d959575bd35c5d0bb136f46068b8cc"}
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.283443 4756 scope.go:117] "RemoveContainer" containerID="80c7c33720a7b2791aab6c4dadb2da291d712365968b3bf32d7b4ea985b2d65e"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.283588 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.305461 4756 scope.go:117] "RemoveContainer" containerID="f13f5c30666cb5ccf26e0a13f5fa3a5486beda45288b824fed18830bdff20098"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.332379 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3" path="/var/lib/kubelet/pods/5873292f-2fc4-4a9d-b7ad-8fac22f3f9d3/volumes"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.350943 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.375369 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.392310 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.405472 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.417838 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.418035 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.418179 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.445267 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.512873 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a634960a-3e75-4837-a471-0a228302abe0-public-tls-certs\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.512934 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a634960a-3e75-4837-a471-0a228302abe0-config-data\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.512979 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ldx2\" (UniqueName: \"kubernetes.io/projected/a634960a-3e75-4837-a471-0a228302abe0-kube-api-access-6ldx2\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.513004 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a634960a-3e75-4837-a471-0a228302abe0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.513050 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a634960a-3e75-4837-a471-0a228302abe0-logs\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.513082 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a634960a-3e75-4837-a471-0a228302abe0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.615525 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a634960a-3e75-4837-a471-0a228302abe0-public-tls-certs\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.615590 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a634960a-3e75-4837-a471-0a228302abe0-config-data\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.615637 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldx2\" (UniqueName: \"kubernetes.io/projected/a634960a-3e75-4837-a471-0a228302abe0-kube-api-access-6ldx2\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.615684 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a634960a-3e75-4837-a471-0a228302abe0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.615733 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a634960a-3e75-4837-a471-0a228302abe0-logs\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.615766 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a634960a-3e75-4837-a471-0a228302abe0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.617781 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a634960a-3e75-4837-a471-0a228302abe0-logs\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.623471 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a634960a-3e75-4837-a471-0a228302abe0-public-tls-certs\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.629845 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a634960a-3e75-4837-a471-0a228302abe0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.630959 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a634960a-3e75-4837-a471-0a228302abe0-config-data\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.631489 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a634960a-3e75-4837-a471-0a228302abe0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.665581 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ldx2\" (UniqueName: \"kubernetes.io/projected/a634960a-3e75-4837-a471-0a228302abe0-kube-api-access-6ldx2\") pod \"nova-api-0\" (UID: \"a634960a-3e75-4837-a471-0a228302abe0\") " pod="openstack/nova-api-0"
Mar 18 14:25:01 crc kubenswrapper[4756]: I0318 14:25:01.752820 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 14:25:02 crc kubenswrapper[4756]: W0318 14:25:02.251683 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda634960a_3e75_4837_a471_0a228302abe0.slice/crio-e8799c371d370406c4df16238694e028aac4d7495d91eeb8d47af27d5e38bfc3 WatchSource:0}: Error finding container e8799c371d370406c4df16238694e028aac4d7495d91eeb8d47af27d5e38bfc3: Status 404 returned error can't find the container with id e8799c371d370406c4df16238694e028aac4d7495d91eeb8d47af27d5e38bfc3
Mar 18 14:25:02 crc kubenswrapper[4756]: I0318 14:25:02.252095 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 14:25:02 crc kubenswrapper[4756]: I0318 14:25:02.307095 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a634960a-3e75-4837-a471-0a228302abe0","Type":"ContainerStarted","Data":"e8799c371d370406c4df16238694e028aac4d7495d91eeb8d47af27d5e38bfc3"}
Mar 18 14:25:02 crc kubenswrapper[4756]: I0318 14:25:02.312940 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2895ca94-33ed-4aa0-bd42-af3b10592ae4","Type":"ContainerStarted","Data":"695c3db9ce4d746e63c32b9d45ca180849e215d31a8b9b5543f1d6bb89118310"}
Mar 18 14:25:02 crc kubenswrapper[4756]: I0318 14:25:02.312979 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2895ca94-33ed-4aa0-bd42-af3b10592ae4","Type":"ContainerStarted","Data":"c8b853043089852751e58df0f16a78202f38f49cd451859db277b3acb5eaf95f"}
Mar 18 14:25:02 crc kubenswrapper[4756]: I0318 14:25:02.334591 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.334569602 podStartE2EDuration="2.334569602s" podCreationTimestamp="2026-03-18 14:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:25:02.327431618 +0000 UTC m=+1503.641849583" watchObservedRunningTime="2026-03-18 14:25:02.334569602 +0000 UTC m=+1503.648987577"
Mar 18 14:25:02 crc kubenswrapper[4756]: I0318 14:25:02.620205 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 18 14:25:03 crc kubenswrapper[4756]: I0318 14:25:03.327568 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="246a5313-e9e8-4819-a58c-65dbc141ea7f" path="/var/lib/kubelet/pods/246a5313-e9e8-4819-a58c-65dbc141ea7f/volumes"
Mar 18 14:25:03 crc kubenswrapper[4756]: I0318 14:25:03.330714 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a634960a-3e75-4837-a471-0a228302abe0","Type":"ContainerStarted","Data":"76d17460365789b2255e4e614f1c26419b521be950a00ff4bc0936c4f76816ac"}
Mar 18 14:25:03 crc kubenswrapper[4756]: I0318 14:25:03.330759 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a634960a-3e75-4837-a471-0a228302abe0","Type":"ContainerStarted","Data":"96595f054968ed3485061cf8c01439481baf7d80d8e68b41dfb68921b207e509"}
Mar 18 14:25:03 crc kubenswrapper[4756]: I0318 14:25:03.359726 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.359702361 podStartE2EDuration="2.359702361s" podCreationTimestamp="2026-03-18 14:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:25:03.350298586 +0000 UTC m=+1504.664716561" watchObservedRunningTime="2026-03-18 14:25:03.359702361 +0000 UTC m=+1504.674120336"
Mar 18 14:25:03 crc kubenswrapper[4756]: I0318 14:25:03.382947 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vkw76"
Mar 18 14:25:03 crc kubenswrapper[4756]: I0318 14:25:03.383251 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vkw76"
Mar 18 14:25:04 crc kubenswrapper[4756]: I0318 14:25:04.428837 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vkw76" podUID="3b2ea224-eb93-4f10-95e8-a1310e31b70f" containerName="registry-server" probeResult="failure" output=<
Mar 18 14:25:04 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s
Mar 18 14:25:04 crc kubenswrapper[4756]: >
Mar 18 14:25:06 crc kubenswrapper[4756]: I0318 14:25:06.916028 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 14:25:06 crc kubenswrapper[4756]: I0318 14:25:06.916307 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 14:25:07 crc kubenswrapper[4756]: I0318 14:25:07.620769 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 18 14:25:07 crc kubenswrapper[4756]: I0318 14:25:07.651700 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 18 14:25:08 crc kubenswrapper[4756]: I0318 14:25:08.401344 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 18 14:25:10 crc kubenswrapper[4756]: I0318 14:25:10.753296 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 18 14:25:10 crc kubenswrapper[4756]: I0318 14:25:10.753657 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 18 14:25:11 crc kubenswrapper[4756]: I0318 14:25:11.753481 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 14:25:11 crc kubenswrapper[4756]: I0318 14:25:11.753858 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 14:25:11 crc kubenswrapper[4756]: I0318 14:25:11.761296 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2895ca94-33ed-4aa0-bd42-af3b10592ae4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.240:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:25:11 crc kubenswrapper[4756]: I0318 14:25:11.772368 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2895ca94-33ed-4aa0-bd42-af3b10592ae4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.240:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:25:12 crc kubenswrapper[4756]: I0318 14:25:12.804274 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a634960a-3e75-4837-a471-0a228302abe0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.241:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 14:25:12 crc kubenswrapper[4756]: I0318 14:25:12.804302 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a634960a-3e75-4837-a471-0a228302abe0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.241:8774/\": net/http: request canceled (Client.Timeout exceeded while
awaiting headers)" Mar 18 14:25:13 crc kubenswrapper[4756]: I0318 14:25:13.457061 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vkw76" Mar 18 14:25:13 crc kubenswrapper[4756]: I0318 14:25:13.509209 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vkw76" Mar 18 14:25:13 crc kubenswrapper[4756]: I0318 14:25:13.715562 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vkw76"] Mar 18 14:25:15 crc kubenswrapper[4756]: I0318 14:25:15.443152 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vkw76" podUID="3b2ea224-eb93-4f10-95e8-a1310e31b70f" containerName="registry-server" containerID="cri-o://69cc379f440e720c0d23e4cfbd5e0926ba1e0b40fe3119d9e60ba0fbb2089b91" gracePeriod=2 Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.298989 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vkw76" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.409363 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfg4l\" (UniqueName: \"kubernetes.io/projected/3b2ea224-eb93-4f10-95e8-a1310e31b70f-kube-api-access-cfg4l\") pod \"3b2ea224-eb93-4f10-95e8-a1310e31b70f\" (UID: \"3b2ea224-eb93-4f10-95e8-a1310e31b70f\") " Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.409480 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea224-eb93-4f10-95e8-a1310e31b70f-catalog-content\") pod \"3b2ea224-eb93-4f10-95e8-a1310e31b70f\" (UID: \"3b2ea224-eb93-4f10-95e8-a1310e31b70f\") " Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.409516 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea224-eb93-4f10-95e8-a1310e31b70f-utilities\") pod \"3b2ea224-eb93-4f10-95e8-a1310e31b70f\" (UID: \"3b2ea224-eb93-4f10-95e8-a1310e31b70f\") " Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.410307 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b2ea224-eb93-4f10-95e8-a1310e31b70f-utilities" (OuterVolumeSpecName: "utilities") pod "3b2ea224-eb93-4f10-95e8-a1310e31b70f" (UID: "3b2ea224-eb93-4f10-95e8-a1310e31b70f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.415002 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b2ea224-eb93-4f10-95e8-a1310e31b70f-kube-api-access-cfg4l" (OuterVolumeSpecName: "kube-api-access-cfg4l") pod "3b2ea224-eb93-4f10-95e8-a1310e31b70f" (UID: "3b2ea224-eb93-4f10-95e8-a1310e31b70f"). InnerVolumeSpecName "kube-api-access-cfg4l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.453852 4756 generic.go:334] "Generic (PLEG): container finished" podID="3b2ea224-eb93-4f10-95e8-a1310e31b70f" containerID="69cc379f440e720c0d23e4cfbd5e0926ba1e0b40fe3119d9e60ba0fbb2089b91" exitCode=0 Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.453893 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkw76" event={"ID":"3b2ea224-eb93-4f10-95e8-a1310e31b70f","Type":"ContainerDied","Data":"69cc379f440e720c0d23e4cfbd5e0926ba1e0b40fe3119d9e60ba0fbb2089b91"} Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.453945 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkw76" event={"ID":"3b2ea224-eb93-4f10-95e8-a1310e31b70f","Type":"ContainerDied","Data":"eeeba384b65df231afe660dafae83a84deda2bd0f77b66c8b74a6a130ff29096"} Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.453963 4756 scope.go:117] "RemoveContainer" containerID="69cc379f440e720c0d23e4cfbd5e0926ba1e0b40fe3119d9e60ba0fbb2089b91" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.453961 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vkw76" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.491955 4756 scope.go:117] "RemoveContainer" containerID="62f71fad038d5624ac73c768dcb7e916ae0deeadd96b9c071097a2c1e2b2aeb6" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.514351 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfg4l\" (UniqueName: \"kubernetes.io/projected/3b2ea224-eb93-4f10-95e8-a1310e31b70f-kube-api-access-cfg4l\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.514398 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea224-eb93-4f10-95e8-a1310e31b70f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.525194 4756 scope.go:117] "RemoveContainer" containerID="26d5fbb28b1ab434c8352ffffbb19953a7c2d2cdd7935a5375e8f9b7f75816ff" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.561208 4756 scope.go:117] "RemoveContainer" containerID="69cc379f440e720c0d23e4cfbd5e0926ba1e0b40fe3119d9e60ba0fbb2089b91" Mar 18 14:25:16 crc kubenswrapper[4756]: E0318 14:25:16.562926 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69cc379f440e720c0d23e4cfbd5e0926ba1e0b40fe3119d9e60ba0fbb2089b91\": container with ID starting with 69cc379f440e720c0d23e4cfbd5e0926ba1e0b40fe3119d9e60ba0fbb2089b91 not found: ID does not exist" containerID="69cc379f440e720c0d23e4cfbd5e0926ba1e0b40fe3119d9e60ba0fbb2089b91" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.562983 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69cc379f440e720c0d23e4cfbd5e0926ba1e0b40fe3119d9e60ba0fbb2089b91"} err="failed to get container status \"69cc379f440e720c0d23e4cfbd5e0926ba1e0b40fe3119d9e60ba0fbb2089b91\": rpc error: code = NotFound desc = could not 
find container \"69cc379f440e720c0d23e4cfbd5e0926ba1e0b40fe3119d9e60ba0fbb2089b91\": container with ID starting with 69cc379f440e720c0d23e4cfbd5e0926ba1e0b40fe3119d9e60ba0fbb2089b91 not found: ID does not exist" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.563010 4756 scope.go:117] "RemoveContainer" containerID="62f71fad038d5624ac73c768dcb7e916ae0deeadd96b9c071097a2c1e2b2aeb6" Mar 18 14:25:16 crc kubenswrapper[4756]: E0318 14:25:16.563295 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62f71fad038d5624ac73c768dcb7e916ae0deeadd96b9c071097a2c1e2b2aeb6\": container with ID starting with 62f71fad038d5624ac73c768dcb7e916ae0deeadd96b9c071097a2c1e2b2aeb6 not found: ID does not exist" containerID="62f71fad038d5624ac73c768dcb7e916ae0deeadd96b9c071097a2c1e2b2aeb6" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.563326 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62f71fad038d5624ac73c768dcb7e916ae0deeadd96b9c071097a2c1e2b2aeb6"} err="failed to get container status \"62f71fad038d5624ac73c768dcb7e916ae0deeadd96b9c071097a2c1e2b2aeb6\": rpc error: code = NotFound desc = could not find container \"62f71fad038d5624ac73c768dcb7e916ae0deeadd96b9c071097a2c1e2b2aeb6\": container with ID starting with 62f71fad038d5624ac73c768dcb7e916ae0deeadd96b9c071097a2c1e2b2aeb6 not found: ID does not exist" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.563345 4756 scope.go:117] "RemoveContainer" containerID="26d5fbb28b1ab434c8352ffffbb19953a7c2d2cdd7935a5375e8f9b7f75816ff" Mar 18 14:25:16 crc kubenswrapper[4756]: E0318 14:25:16.564324 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d5fbb28b1ab434c8352ffffbb19953a7c2d2cdd7935a5375e8f9b7f75816ff\": container with ID starting with 26d5fbb28b1ab434c8352ffffbb19953a7c2d2cdd7935a5375e8f9b7f75816ff not found: ID 
does not exist" containerID="26d5fbb28b1ab434c8352ffffbb19953a7c2d2cdd7935a5375e8f9b7f75816ff" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.564377 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d5fbb28b1ab434c8352ffffbb19953a7c2d2cdd7935a5375e8f9b7f75816ff"} err="failed to get container status \"26d5fbb28b1ab434c8352ffffbb19953a7c2d2cdd7935a5375e8f9b7f75816ff\": rpc error: code = NotFound desc = could not find container \"26d5fbb28b1ab434c8352ffffbb19953a7c2d2cdd7935a5375e8f9b7f75816ff\": container with ID starting with 26d5fbb28b1ab434c8352ffffbb19953a7c2d2cdd7935a5375e8f9b7f75816ff not found: ID does not exist" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.564809 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b2ea224-eb93-4f10-95e8-a1310e31b70f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b2ea224-eb93-4f10-95e8-a1310e31b70f" (UID: "3b2ea224-eb93-4f10-95e8-a1310e31b70f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.616053 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea224-eb93-4f10-95e8-a1310e31b70f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.788702 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vkw76"] Mar 18 14:25:16 crc kubenswrapper[4756]: I0318 14:25:16.821194 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vkw76"] Mar 18 14:25:17 crc kubenswrapper[4756]: I0318 14:25:17.329381 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b2ea224-eb93-4f10-95e8-a1310e31b70f" path="/var/lib/kubelet/pods/3b2ea224-eb93-4f10-95e8-a1310e31b70f/volumes" Mar 18 14:25:18 crc kubenswrapper[4756]: I0318 14:25:18.753046 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 14:25:18 crc kubenswrapper[4756]: I0318 14:25:18.753432 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 14:25:19 crc kubenswrapper[4756]: I0318 14:25:19.753983 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 14:25:19 crc kubenswrapper[4756]: I0318 14:25:19.755128 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 14:25:20 crc kubenswrapper[4756]: I0318 14:25:20.758508 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 14:25:20 crc kubenswrapper[4756]: I0318 14:25:20.758637 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 14:25:20 crc kubenswrapper[4756]: I0318 14:25:20.779357 4756 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 14:25:20 crc kubenswrapper[4756]: I0318 14:25:20.779984 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 14:25:21 crc kubenswrapper[4756]: I0318 14:25:21.488719 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 14:25:21 crc kubenswrapper[4756]: I0318 14:25:21.777930 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 14:25:21 crc kubenswrapper[4756]: I0318 14:25:21.782611 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 14:25:21 crc kubenswrapper[4756]: I0318 14:25:21.785649 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 14:25:22 crc kubenswrapper[4756]: I0318 14:25:22.517072 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 14:25:32 crc kubenswrapper[4756]: I0318 14:25:32.684848 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-tpt8q"] Mar 18 14:25:32 crc kubenswrapper[4756]: I0318 14:25:32.698292 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-tpt8q"] Mar 18 14:25:32 crc kubenswrapper[4756]: I0318 14:25:32.768244 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-lxm7t"] Mar 18 14:25:32 crc kubenswrapper[4756]: E0318 14:25:32.768775 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2ea224-eb93-4f10-95e8-a1310e31b70f" containerName="extract-content" Mar 18 14:25:32 crc kubenswrapper[4756]: I0318 14:25:32.768791 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2ea224-eb93-4f10-95e8-a1310e31b70f" containerName="extract-content" Mar 18 14:25:32 crc kubenswrapper[4756]: E0318 
14:25:32.768799 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2ea224-eb93-4f10-95e8-a1310e31b70f" containerName="registry-server" Mar 18 14:25:32 crc kubenswrapper[4756]: I0318 14:25:32.768806 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2ea224-eb93-4f10-95e8-a1310e31b70f" containerName="registry-server" Mar 18 14:25:32 crc kubenswrapper[4756]: E0318 14:25:32.768836 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2ea224-eb93-4f10-95e8-a1310e31b70f" containerName="extract-utilities" Mar 18 14:25:32 crc kubenswrapper[4756]: I0318 14:25:32.768844 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2ea224-eb93-4f10-95e8-a1310e31b70f" containerName="extract-utilities" Mar 18 14:25:32 crc kubenswrapper[4756]: I0318 14:25:32.769056 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b2ea224-eb93-4f10-95e8-a1310e31b70f" containerName="registry-server" Mar 18 14:25:32 crc kubenswrapper[4756]: I0318 14:25:32.769891 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:32 crc kubenswrapper[4756]: I0318 14:25:32.772720 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 14:25:32 crc kubenswrapper[4756]: I0318 14:25:32.778786 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-lxm7t"] Mar 18 14:25:32 crc kubenswrapper[4756]: I0318 14:25:32.940605 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wktnb\" (UniqueName: \"kubernetes.io/projected/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-kube-api-access-wktnb\") pod \"cloudkitty-db-sync-lxm7t\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:32 crc kubenswrapper[4756]: I0318 14:25:32.941249 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-config-data\") pod \"cloudkitty-db-sync-lxm7t\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:32 crc kubenswrapper[4756]: I0318 14:25:32.941390 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-combined-ca-bundle\") pod \"cloudkitty-db-sync-lxm7t\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:32 crc kubenswrapper[4756]: I0318 14:25:32.941531 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-scripts\") pod \"cloudkitty-db-sync-lxm7t\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:32 crc 
kubenswrapper[4756]: I0318 14:25:32.941659 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-certs\") pod \"cloudkitty-db-sync-lxm7t\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:33 crc kubenswrapper[4756]: I0318 14:25:33.043708 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-certs\") pod \"cloudkitty-db-sync-lxm7t\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:33 crc kubenswrapper[4756]: I0318 14:25:33.044100 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wktnb\" (UniqueName: \"kubernetes.io/projected/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-kube-api-access-wktnb\") pod \"cloudkitty-db-sync-lxm7t\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:33 crc kubenswrapper[4756]: I0318 14:25:33.044157 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-config-data\") pod \"cloudkitty-db-sync-lxm7t\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:33 crc kubenswrapper[4756]: I0318 14:25:33.044199 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-combined-ca-bundle\") pod \"cloudkitty-db-sync-lxm7t\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:33 crc kubenswrapper[4756]: I0318 14:25:33.044248 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-scripts\") pod \"cloudkitty-db-sync-lxm7t\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:33 crc kubenswrapper[4756]: I0318 14:25:33.049577 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-scripts\") pod \"cloudkitty-db-sync-lxm7t\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:33 crc kubenswrapper[4756]: I0318 14:25:33.050004 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-config-data\") pod \"cloudkitty-db-sync-lxm7t\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:33 crc kubenswrapper[4756]: I0318 14:25:33.050817 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-combined-ca-bundle\") pod \"cloudkitty-db-sync-lxm7t\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:33 crc kubenswrapper[4756]: I0318 14:25:33.051595 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-certs\") pod \"cloudkitty-db-sync-lxm7t\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:33 crc kubenswrapper[4756]: I0318 14:25:33.067270 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wktnb\" (UniqueName: \"kubernetes.io/projected/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-kube-api-access-wktnb\") pod 
\"cloudkitty-db-sync-lxm7t\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:33 crc kubenswrapper[4756]: I0318 14:25:33.085496 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:25:33 crc kubenswrapper[4756]: I0318 14:25:33.336658 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7401eed2-7f0c-4f80-932b-5bc2df6684f8" path="/var/lib/kubelet/pods/7401eed2-7f0c-4f80-932b-5bc2df6684f8/volumes" Mar 18 14:25:33 crc kubenswrapper[4756]: I0318 14:25:33.626902 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-lxm7t"] Mar 18 14:25:33 crc kubenswrapper[4756]: I0318 14:25:33.636933 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:25:34 crc kubenswrapper[4756]: I0318 14:25:34.643249 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-lxm7t" event={"ID":"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2","Type":"ContainerStarted","Data":"e2b2c7369ced4c238693dbe6913e6211885f8d0a269eaaa60340823e10cebdc7"} Mar 18 14:25:34 crc kubenswrapper[4756]: I0318 14:25:34.822858 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 14:25:34 crc kubenswrapper[4756]: I0318 14:25:34.823101 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="ceilometer-central-agent" containerID="cri-o://6b05b0858d1ecf12252983a5a833bf1686ff3f3752e16451acf7dcd5d7fc6549" gracePeriod=30 Mar 18 14:25:34 crc kubenswrapper[4756]: I0318 14:25:34.823535 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="proxy-httpd" 
containerID="cri-o://e4038ab35e9263481a511e49a91f67c2b20be05770e28c7a766b3f118332292a" gracePeriod=30 Mar 18 14:25:34 crc kubenswrapper[4756]: I0318 14:25:34.823585 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="sg-core" containerID="cri-o://777aae07aea8c03f4dd308159c968f7e1602eba563f6ff0b8826c4fa32eef7a6" gracePeriod=30 Mar 18 14:25:34 crc kubenswrapper[4756]: I0318 14:25:34.823616 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="ceilometer-notification-agent" containerID="cri-o://2218946ffdb97078a3ec5a7ae5479a8301bf6c8cc7d335bcab668670aef9761b" gracePeriod=30 Mar 18 14:25:34 crc kubenswrapper[4756]: I0318 14:25:34.980667 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 14:25:35 crc kubenswrapper[4756]: I0318 14:25:35.661041 4756 generic.go:334] "Generic (PLEG): container finished" podID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerID="e4038ab35e9263481a511e49a91f67c2b20be05770e28c7a766b3f118332292a" exitCode=0 Mar 18 14:25:35 crc kubenswrapper[4756]: I0318 14:25:35.661399 4756 generic.go:334] "Generic (PLEG): container finished" podID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerID="777aae07aea8c03f4dd308159c968f7e1602eba563f6ff0b8826c4fa32eef7a6" exitCode=2 Mar 18 14:25:35 crc kubenswrapper[4756]: I0318 14:25:35.661408 4756 generic.go:334] "Generic (PLEG): container finished" podID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerID="6b05b0858d1ecf12252983a5a833bf1686ff3f3752e16451acf7dcd5d7fc6549" exitCode=0 Mar 18 14:25:35 crc kubenswrapper[4756]: I0318 14:25:35.661126 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c898ac99-8bd7-467f-8540-0b23e255b8a0","Type":"ContainerDied","Data":"e4038ab35e9263481a511e49a91f67c2b20be05770e28c7a766b3f118332292a"}
Mar 18 14:25:35 crc kubenswrapper[4756]: I0318 14:25:35.661442 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c898ac99-8bd7-467f-8540-0b23e255b8a0","Type":"ContainerDied","Data":"777aae07aea8c03f4dd308159c968f7e1602eba563f6ff0b8826c4fa32eef7a6"}
Mar 18 14:25:35 crc kubenswrapper[4756]: I0318 14:25:35.661455 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c898ac99-8bd7-467f-8540-0b23e255b8a0","Type":"ContainerDied","Data":"6b05b0858d1ecf12252983a5a833bf1686ff3f3752e16451acf7dcd5d7fc6549"}
Mar 18 14:25:35 crc kubenswrapper[4756]: I0318 14:25:35.954244 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 14:25:36 crc kubenswrapper[4756]: I0318 14:25:36.915758 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 14:25:36 crc kubenswrapper[4756]: I0318 14:25:36.916111 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 14:25:38 crc kubenswrapper[4756]: I0318 14:25:38.696548 4756 generic.go:334] "Generic (PLEG): container finished" podID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerID="2218946ffdb97078a3ec5a7ae5479a8301bf6c8cc7d335bcab668670aef9761b" exitCode=0
Mar 18 14:25:38 crc kubenswrapper[4756]: I0318 14:25:38.696617 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c898ac99-8bd7-467f-8540-0b23e255b8a0","Type":"ContainerDied","Data":"2218946ffdb97078a3ec5a7ae5479a8301bf6c8cc7d335bcab668670aef9761b"}
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.045778 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.191513 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7k4m\" (UniqueName: \"kubernetes.io/projected/c898ac99-8bd7-467f-8540-0b23e255b8a0-kube-api-access-t7k4m\") pod \"c898ac99-8bd7-467f-8540-0b23e255b8a0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") "
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.191583 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-config-data\") pod \"c898ac99-8bd7-467f-8540-0b23e255b8a0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") "
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.191651 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c898ac99-8bd7-467f-8540-0b23e255b8a0-log-httpd\") pod \"c898ac99-8bd7-467f-8540-0b23e255b8a0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") "
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.191687 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c898ac99-8bd7-467f-8540-0b23e255b8a0-run-httpd\") pod \"c898ac99-8bd7-467f-8540-0b23e255b8a0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") "
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.191793 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-combined-ca-bundle\") pod \"c898ac99-8bd7-467f-8540-0b23e255b8a0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") "
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.191826 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-ceilometer-tls-certs\") pod \"c898ac99-8bd7-467f-8540-0b23e255b8a0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") "
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.191860 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-scripts\") pod \"c898ac99-8bd7-467f-8540-0b23e255b8a0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") "
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.191926 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-sg-core-conf-yaml\") pod \"c898ac99-8bd7-467f-8540-0b23e255b8a0\" (UID: \"c898ac99-8bd7-467f-8540-0b23e255b8a0\") "
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.192172 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c898ac99-8bd7-467f-8540-0b23e255b8a0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c898ac99-8bd7-467f-8540-0b23e255b8a0" (UID: "c898ac99-8bd7-467f-8540-0b23e255b8a0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.192408 4756 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c898ac99-8bd7-467f-8540-0b23e255b8a0-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.194267 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c898ac99-8bd7-467f-8540-0b23e255b8a0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c898ac99-8bd7-467f-8540-0b23e255b8a0" (UID: "c898ac99-8bd7-467f-8540-0b23e255b8a0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.202745 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-scripts" (OuterVolumeSpecName: "scripts") pod "c898ac99-8bd7-467f-8540-0b23e255b8a0" (UID: "c898ac99-8bd7-467f-8540-0b23e255b8a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.203866 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c898ac99-8bd7-467f-8540-0b23e255b8a0-kube-api-access-t7k4m" (OuterVolumeSpecName: "kube-api-access-t7k4m") pod "c898ac99-8bd7-467f-8540-0b23e255b8a0" (UID: "c898ac99-8bd7-467f-8540-0b23e255b8a0"). InnerVolumeSpecName "kube-api-access-t7k4m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.239738 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c898ac99-8bd7-467f-8540-0b23e255b8a0" (UID: "c898ac99-8bd7-467f-8540-0b23e255b8a0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.296240 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.296275 4756 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.296309 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7k4m\" (UniqueName: \"kubernetes.io/projected/c898ac99-8bd7-467f-8540-0b23e255b8a0-kube-api-access-t7k4m\") on node \"crc\" DevicePath \"\""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.296319 4756 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c898ac99-8bd7-467f-8540-0b23e255b8a0-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.402072 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c898ac99-8bd7-467f-8540-0b23e255b8a0" (UID: "c898ac99-8bd7-467f-8540-0b23e255b8a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.405956 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c898ac99-8bd7-467f-8540-0b23e255b8a0" (UID: "c898ac99-8bd7-467f-8540-0b23e255b8a0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.463926 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-config-data" (OuterVolumeSpecName: "config-data") pod "c898ac99-8bd7-467f-8540-0b23e255b8a0" (UID: "c898ac99-8bd7-467f-8540-0b23e255b8a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.501416 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.501449 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.501460 4756 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c898ac99-8bd7-467f-8540-0b23e255b8a0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.715850 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c898ac99-8bd7-467f-8540-0b23e255b8a0","Type":"ContainerDied","Data":"aeb7174ec7d58ed04c25ee870da85d041b9dd65b9699d1673b5ef50da5b13817"}
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.716195 4756 scope.go:117] "RemoveContainer" containerID="e4038ab35e9263481a511e49a91f67c2b20be05770e28c7a766b3f118332292a"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.716371 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.751678 4756 scope.go:117] "RemoveContainer" containerID="777aae07aea8c03f4dd308159c968f7e1602eba563f6ff0b8826c4fa32eef7a6"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.755939 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.775733 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.786838 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.790278 4756 scope.go:117] "RemoveContainer" containerID="2218946ffdb97078a3ec5a7ae5479a8301bf6c8cc7d335bcab668670aef9761b"
Mar 18 14:25:39 crc kubenswrapper[4756]: E0318 14:25:39.792010 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="sg-core"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.792042 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="sg-core"
Mar 18 14:25:39 crc kubenswrapper[4756]: E0318 14:25:39.792061 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="proxy-httpd"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.792067 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="proxy-httpd"
Mar 18 14:25:39 crc kubenswrapper[4756]: E0318 14:25:39.792082 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="ceilometer-central-agent"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.792090 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="ceilometer-central-agent"
Mar 18 14:25:39 crc kubenswrapper[4756]: E0318 14:25:39.792106 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="ceilometer-notification-agent"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.792112 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="ceilometer-notification-agent"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.792309 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="proxy-httpd"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.792319 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="ceilometer-notification-agent"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.792332 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="sg-core"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.792343 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" containerName="ceilometer-central-agent"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.794559 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.799657 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.799873 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.800074 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.823312 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.828267 4756 scope.go:117] "RemoveContainer" containerID="6b05b0858d1ecf12252983a5a833bf1686ff3f3752e16451acf7dcd5d7fc6549"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.908285 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a36d4c-c545-467d-a0a8-0aa38de63eb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.908321 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krl4v\" (UniqueName: \"kubernetes.io/projected/14a36d4c-c545-467d-a0a8-0aa38de63eb1-kube-api-access-krl4v\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.908346 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a36d4c-c545-467d-a0a8-0aa38de63eb1-scripts\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.908399 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a36d4c-c545-467d-a0a8-0aa38de63eb1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.908434 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a36d4c-c545-467d-a0a8-0aa38de63eb1-log-httpd\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.908472 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a36d4c-c545-467d-a0a8-0aa38de63eb1-config-data\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.908496 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14a36d4c-c545-467d-a0a8-0aa38de63eb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.908516 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a36d4c-c545-467d-a0a8-0aa38de63eb1-run-httpd\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:39 crc kubenswrapper[4756]: I0318 14:25:39.934709 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" containerName="rabbitmq" containerID="cri-o://2553af4c815e015e2957c0e66da515607ec6a5ad37907645a3db8f52bc3adf5b" gracePeriod=604796
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.010086 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a36d4c-c545-467d-a0a8-0aa38de63eb1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.010156 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a36d4c-c545-467d-a0a8-0aa38de63eb1-log-httpd\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.010206 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a36d4c-c545-467d-a0a8-0aa38de63eb1-config-data\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.010230 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14a36d4c-c545-467d-a0a8-0aa38de63eb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.010252 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a36d4c-c545-467d-a0a8-0aa38de63eb1-run-httpd\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.010326 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a36d4c-c545-467d-a0a8-0aa38de63eb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.010344 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krl4v\" (UniqueName: \"kubernetes.io/projected/14a36d4c-c545-467d-a0a8-0aa38de63eb1-kube-api-access-krl4v\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.010365 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a36d4c-c545-467d-a0a8-0aa38de63eb1-scripts\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.010734 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a36d4c-c545-467d-a0a8-0aa38de63eb1-log-httpd\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.011286 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a36d4c-c545-467d-a0a8-0aa38de63eb1-run-httpd\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.015665 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14a36d4c-c545-467d-a0a8-0aa38de63eb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.015952 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a36d4c-c545-467d-a0a8-0aa38de63eb1-scripts\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.017084 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a36d4c-c545-467d-a0a8-0aa38de63eb1-config-data\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.017617 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a36d4c-c545-467d-a0a8-0aa38de63eb1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.017781 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a36d4c-c545-467d-a0a8-0aa38de63eb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.027074 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krl4v\" (UniqueName: \"kubernetes.io/projected/14a36d4c-c545-467d-a0a8-0aa38de63eb1-kube-api-access-krl4v\") pod \"ceilometer-0\" (UID: \"14a36d4c-c545-467d-a0a8-0aa38de63eb1\") " pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.118283 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.645455 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 14:25:40 crc kubenswrapper[4756]: I0318 14:25:40.860745 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="228ca85e-a493-4dc4-9b95-5148c92ba228" containerName="rabbitmq" containerID="cri-o://7c17f87498ec948bd713efcefc461bae70853e835995551fae83a4dd7fe7ecd1" gracePeriod=604796
Mar 18 14:25:41 crc kubenswrapper[4756]: I0318 14:25:41.329951 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c898ac99-8bd7-467f-8540-0b23e255b8a0" path="/var/lib/kubelet/pods/c898ac99-8bd7-467f-8540-0b23e255b8a0/volumes"
Mar 18 14:25:44 crc kubenswrapper[4756]: I0318 14:25:44.648246 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="228ca85e-a493-4dc4-9b95-5148c92ba228" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused"
Mar 18 14:25:44 crc kubenswrapper[4756]: I0318 14:25:44.966199 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused"
Mar 18 14:25:46 crc kubenswrapper[4756]: I0318 14:25:46.783716 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14a36d4c-c545-467d-a0a8-0aa38de63eb1","Type":"ContainerStarted","Data":"e2e415c426af14d3c16dbc1b98b08b60e4a883a3e67c774e4b9c5240b5e9a071"}
Mar 18 14:25:46 crc kubenswrapper[4756]: I0318 14:25:46.786907 4756 generic.go:334] "Generic (PLEG): container finished" podID="3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" containerID="2553af4c815e015e2957c0e66da515607ec6a5ad37907645a3db8f52bc3adf5b" exitCode=0
Mar 18 14:25:46 crc kubenswrapper[4756]: I0318 14:25:46.786937 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce","Type":"ContainerDied","Data":"2553af4c815e015e2957c0e66da515607ec6a5ad37907645a3db8f52bc3adf5b"}
Mar 18 14:25:47 crc kubenswrapper[4756]: I0318 14:25:47.804237 4756 generic.go:334] "Generic (PLEG): container finished" podID="228ca85e-a493-4dc4-9b95-5148c92ba228" containerID="7c17f87498ec948bd713efcefc461bae70853e835995551fae83a4dd7fe7ecd1" exitCode=0
Mar 18 14:25:47 crc kubenswrapper[4756]: I0318 14:25:47.804319 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"228ca85e-a493-4dc4-9b95-5148c92ba228","Type":"ContainerDied","Data":"7c17f87498ec948bd713efcefc461bae70853e835995551fae83a4dd7fe7ecd1"}
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.477376 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-6x4hm"]
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.479515 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.481885 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.487538 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-6x4hm"]
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.517357 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.517405 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-config\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.517425 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vhms\" (UniqueName: \"kubernetes.io/projected/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-kube-api-access-7vhms\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.517477 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.517539 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.517576 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.517708 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.619655 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.619765 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.619820 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.619876 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.619954 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.619986 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-config\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.620003 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vhms\" (UniqueName: \"kubernetes.io/projected/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-kube-api-access-7vhms\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.620868 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.620957 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.621036 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.622614 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.622827 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-config\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.622983 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.654393 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vhms\" (UniqueName: \"kubernetes.io/projected/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-kube-api-access-7vhms\") pod \"dnsmasq-dns-dbb88bf8c-6x4hm\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:49 crc kubenswrapper[4756]: I0318 14:25:49.823313 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.566818 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.611797 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-erlang-cookie-secret\") pod \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") "
Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.611847 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-confd\") pod \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") "
Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.613452 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\") pod \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") "
Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.614844 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-pod-info\") pod \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") "
Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.614911 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-plugins-conf\") pod \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") "
Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.614939 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-erlang-cookie\") pod \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") "
Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.614971 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-config-data\") pod \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") "
Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.615004 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-tls\") pod \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") "
Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.615110 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-plugins\") pod \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.615144 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-server-conf\") pod \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.615177 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc68v\" (UniqueName: \"kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-kube-api-access-xc68v\") pod \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") " Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.616298 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" (UID: "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.621139 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-kube-api-access-xc68v" (OuterVolumeSpecName: "kube-api-access-xc68v") pod "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" (UID: "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce"). InnerVolumeSpecName "kube-api-access-xc68v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.621232 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" (UID: "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.622035 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-pod-info" (OuterVolumeSpecName: "pod-info") pod "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" (UID: "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.622081 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" (UID: "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.622177 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" (UID: "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.656789 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" (UID: "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.695052 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-config-data" (OuterVolumeSpecName: "config-data") pod "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" (UID: "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.716487 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea" (OuterVolumeSpecName: "persistence") pod "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" (UID: "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce"). InnerVolumeSpecName "pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 14:25:52 crc kubenswrapper[4756]: E0318 14:25:52.717087 4756 reconciler_common.go:156] "operationExecutor.UnmountVolume failed (controllerAttachDetachEnabled true) for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\") pod \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") : UnmountVolume.NewUnmounter failed for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\") pod \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") : kubernetes.io/csi: unmounter failed to load volume data file [/var/lib/kubelet/pods/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce/volumes/kubernetes.io~csi/pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea/mount]: kubernetes.io/csi: failed to open volume data file [/var/lib/kubelet/pods/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce/volumes/kubernetes.io~csi/pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea/vol_data.json]: open /var/lib/kubelet/pods/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce/volumes/kubernetes.io~csi/pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea/vol_data.json: no such file or directory" err="UnmountVolume.NewUnmounter failed for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\") pod \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\" (UID: \"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce\") : kubernetes.io/csi: unmounter failed to load volume data file [/var/lib/kubelet/pods/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce/volumes/kubernetes.io~csi/pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea/mount]: kubernetes.io/csi: failed to open volume data file [/var/lib/kubelet/pods/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce/volumes/kubernetes.io~csi/pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea/vol_data.json]: open 
/var/lib/kubelet/pods/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce/volumes/kubernetes.io~csi/pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea/vol_data.json: no such file or directory" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.717765 4756 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.717784 4756 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.717794 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.717803 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.717812 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.717819 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.717828 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc68v\" (UniqueName: 
\"kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-kube-api-access-xc68v\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.717835 4756 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.717855 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\") on node \"crc\" " Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.749200 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-server-conf" (OuterVolumeSpecName: "server-conf") pod "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" (UID: "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.759396 4756 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.760092 4756 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea") on node "crc" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.819800 4756 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.819840 4756 reconciler_common.go:293] "Volume detached for volume \"pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.866783 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3ba9ee19-92fe-4384-a7d5-dd9a28e15cce","Type":"ContainerDied","Data":"be8855feea12b5d5b568bbf4c2eb5c2979962e937b09985eb3d98a6107dbbe85"} Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.866853 4756 scope.go:117] "RemoveContainer" containerID="2553af4c815e015e2957c0e66da515607ec6a5ad37907645a3db8f52bc3adf5b" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.867029 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 14:25:52 crc kubenswrapper[4756]: I0318 14:25:52.939819 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" (UID: "3ba9ee19-92fe-4384-a7d5-dd9a28e15cce"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.023661 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.232194 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.256725 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.267887 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 14:25:53 crc kubenswrapper[4756]: E0318 14:25:53.268397 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" containerName="setup-container" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.268411 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" containerName="setup-container" Mar 18 14:25:53 crc kubenswrapper[4756]: E0318 14:25:53.268421 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" containerName="rabbitmq" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.268427 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" containerName="rabbitmq" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.268645 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" containerName="rabbitmq" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.269921 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.275788 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.275824 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.276479 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.276618 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-stdr5" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.276726 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.276912 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.277469 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.282821 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.329443 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ba9ee19-92fe-4384-a7d5-dd9a28e15cce" path="/var/lib/kubelet/pods/3ba9ee19-92fe-4384-a7d5-dd9a28e15cce/volumes" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.431632 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc1f1584-6d11-4821-8a1d-4a58648313e3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " 
pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.431729 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfqpp\" (UniqueName: \"kubernetes.io/projected/cc1f1584-6d11-4821-8a1d-4a58648313e3-kube-api-access-jfqpp\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.431764 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc1f1584-6d11-4821-8a1d-4a58648313e3-config-data\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.431815 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc1f1584-6d11-4821-8a1d-4a58648313e3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.431914 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.431990 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc1f1584-6d11-4821-8a1d-4a58648313e3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " 
pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.432036 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc1f1584-6d11-4821-8a1d-4a58648313e3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.432063 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc1f1584-6d11-4821-8a1d-4a58648313e3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.432140 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc1f1584-6d11-4821-8a1d-4a58648313e3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.432184 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc1f1584-6d11-4821-8a1d-4a58648313e3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.432244 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc1f1584-6d11-4821-8a1d-4a58648313e3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 
14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.533822 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc1f1584-6d11-4821-8a1d-4a58648313e3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.533886 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc1f1584-6d11-4821-8a1d-4a58648313e3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.533936 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc1f1584-6d11-4821-8a1d-4a58648313e3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.533978 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc1f1584-6d11-4821-8a1d-4a58648313e3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.534038 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc1f1584-6d11-4821-8a1d-4a58648313e3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.534083 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc1f1584-6d11-4821-8a1d-4a58648313e3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.534175 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfqpp\" (UniqueName: \"kubernetes.io/projected/cc1f1584-6d11-4821-8a1d-4a58648313e3-kube-api-access-jfqpp\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.534206 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc1f1584-6d11-4821-8a1d-4a58648313e3-config-data\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.534259 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc1f1584-6d11-4821-8a1d-4a58648313e3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.534376 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.534474 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc1f1584-6d11-4821-8a1d-4a58648313e3-pod-info\") pod 
\"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.535021 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc1f1584-6d11-4821-8a1d-4a58648313e3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.535057 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc1f1584-6d11-4821-8a1d-4a58648313e3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.535840 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc1f1584-6d11-4821-8a1d-4a58648313e3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.536058 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc1f1584-6d11-4821-8a1d-4a58648313e3-config-data\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.537484 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc1f1584-6d11-4821-8a1d-4a58648313e3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.540433 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc1f1584-6d11-4821-8a1d-4a58648313e3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.540658 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc1f1584-6d11-4821-8a1d-4a58648313e3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.541722 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc1f1584-6d11-4821-8a1d-4a58648313e3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.542427 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.542487 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/38f97d1707111d96e116f25286c985a7c009ee85bc99aa932c406463acfc1268/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.546829 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc1f1584-6d11-4821-8a1d-4a58648313e3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.555877 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfqpp\" (UniqueName: \"kubernetes.io/projected/cc1f1584-6d11-4821-8a1d-4a58648313e3-kube-api-access-jfqpp\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.629468 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1cc0b1cb-aa71-45bd-908f-cf8c6d93b6ea\") pod \"rabbitmq-server-0\" (UID: \"cc1f1584-6d11-4821-8a1d-4a58648313e3\") " pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.836516 4756 scope.go:117] "RemoveContainer" containerID="e66e72f8966d8210856953e9f931b81d3562014f300152eb8832f6595cff013c" Mar 18 14:25:53 crc kubenswrapper[4756]: E0318 14:25:53.849797 4756 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested" Mar 18 14:25:53 crc kubenswrapper[4756]: E0318 14:25:53.849857 4756 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested" Mar 18 14:25:53 crc kubenswrapper[4756]: E0318 14:25:53.849992 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,
MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wktnb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-lxm7t_openstack(6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 14:25:53 crc kubenswrapper[4756]: E0318 14:25:53.851654 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-lxm7t" podUID="6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.890599 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"228ca85e-a493-4dc4-9b95-5148c92ba228","Type":"ContainerDied","Data":"720448d87238743264c16f885f312f38ac03efe053862a641da99b7ba5023af5"} Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.890656 4756 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="720448d87238743264c16f885f312f38ac03efe053862a641da99b7ba5023af5" Mar 18 14:25:53 crc kubenswrapper[4756]: E0318 14:25:53.898948 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested\\\"\"" pod="openstack/cloudkitty-db-sync-lxm7t" podUID="6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.900988 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 14:25:53 crc kubenswrapper[4756]: I0318 14:25:53.925743 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.060799 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-erlang-cookie\") pod \"228ca85e-a493-4dc4-9b95-5148c92ba228\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.060843 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-plugins\") pod \"228ca85e-a493-4dc4-9b95-5148c92ba228\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.060866 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-tls\") pod \"228ca85e-a493-4dc4-9b95-5148c92ba228\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.060940 4756 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-server-conf\") pod \"228ca85e-a493-4dc4-9b95-5148c92ba228\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.062415 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\") pod \"228ca85e-a493-4dc4-9b95-5148c92ba228\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.062474 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/228ca85e-a493-4dc4-9b95-5148c92ba228-erlang-cookie-secret\") pod \"228ca85e-a493-4dc4-9b95-5148c92ba228\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.062540 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/228ca85e-a493-4dc4-9b95-5148c92ba228-pod-info\") pod \"228ca85e-a493-4dc4-9b95-5148c92ba228\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.062623 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-config-data\") pod \"228ca85e-a493-4dc4-9b95-5148c92ba228\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.062658 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-confd\") pod 
\"228ca85e-a493-4dc4-9b95-5148c92ba228\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.062711 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-585hj\" (UniqueName: \"kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-kube-api-access-585hj\") pod \"228ca85e-a493-4dc4-9b95-5148c92ba228\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.062743 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-plugins-conf\") pod \"228ca85e-a493-4dc4-9b95-5148c92ba228\" (UID: \"228ca85e-a493-4dc4-9b95-5148c92ba228\") " Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.070412 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "228ca85e-a493-4dc4-9b95-5148c92ba228" (UID: "228ca85e-a493-4dc4-9b95-5148c92ba228"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.080509 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/228ca85e-a493-4dc4-9b95-5148c92ba228-pod-info" (OuterVolumeSpecName: "pod-info") pod "228ca85e-a493-4dc4-9b95-5148c92ba228" (UID: "228ca85e-a493-4dc4-9b95-5148c92ba228"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.081041 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "228ca85e-a493-4dc4-9b95-5148c92ba228" (UID: "228ca85e-a493-4dc4-9b95-5148c92ba228"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.090241 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "228ca85e-a493-4dc4-9b95-5148c92ba228" (UID: "228ca85e-a493-4dc4-9b95-5148c92ba228"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.099220 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "228ca85e-a493-4dc4-9b95-5148c92ba228" (UID: "228ca85e-a493-4dc4-9b95-5148c92ba228"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.111690 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228ca85e-a493-4dc4-9b95-5148c92ba228-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "228ca85e-a493-4dc4-9b95-5148c92ba228" (UID: "228ca85e-a493-4dc4-9b95-5148c92ba228"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.117413 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c16bce09-4bdd-4cac-93b2-945569c3acee" (OuterVolumeSpecName: "persistence") pod "228ca85e-a493-4dc4-9b95-5148c92ba228" (UID: "228ca85e-a493-4dc4-9b95-5148c92ba228"). InnerVolumeSpecName "pvc-c16bce09-4bdd-4cac-93b2-945569c3acee". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.118678 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-kube-api-access-585hj" (OuterVolumeSpecName: "kube-api-access-585hj") pod "228ca85e-a493-4dc4-9b95-5148c92ba228" (UID: "228ca85e-a493-4dc4-9b95-5148c92ba228"). InnerVolumeSpecName "kube-api-access-585hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.135807 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-config-data" (OuterVolumeSpecName: "config-data") pod "228ca85e-a493-4dc4-9b95-5148c92ba228" (UID: "228ca85e-a493-4dc4-9b95-5148c92ba228"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.172972 4756 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/228ca85e-a493-4dc4-9b95-5148c92ba228-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.173004 4756 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/228ca85e-a493-4dc4-9b95-5148c92ba228-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.173013 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.173023 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-585hj\" (UniqueName: \"kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-kube-api-access-585hj\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.173033 4756 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.173043 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.173051 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:54 crc kubenswrapper[4756]: 
I0318 14:25:54.173060 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.173090 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\") on node \"crc\" " Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.210890 4756 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.211043 4756 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c16bce09-4bdd-4cac-93b2-945569c3acee" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c16bce09-4bdd-4cac-93b2-945569c3acee") on node "crc" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.212202 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-server-conf" (OuterVolumeSpecName: "server-conf") pod "228ca85e-a493-4dc4-9b95-5148c92ba228" (UID: "228ca85e-a493-4dc4-9b95-5148c92ba228"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.222450 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "228ca85e-a493-4dc4-9b95-5148c92ba228" (UID: "228ca85e-a493-4dc4-9b95-5148c92ba228"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.274427 4756 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/228ca85e-a493-4dc4-9b95-5148c92ba228-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.274455 4756 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/228ca85e-a493-4dc4-9b95-5148c92ba228-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.274475 4756 reconciler_common.go:293] "Volume detached for volume \"pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\") on node \"crc\" DevicePath \"\"" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.350763 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-6x4hm"] Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.622767 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.908160 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc1f1584-6d11-4821-8a1d-4a58648313e3","Type":"ContainerStarted","Data":"28d30334e691ddd104eb8bfa2458210a1456113d3eaa346ad08ceb87a0cf2659"} Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.910967 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.911649 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm" event={"ID":"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53","Type":"ContainerStarted","Data":"217eaa954f197e8110c85ed5f4150ffccec0c41a2f305d03faa6117c9619abfc"} Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.911670 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm" event={"ID":"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53","Type":"ContainerStarted","Data":"fdb68db1b4821416d1534fb5b3fc6a003ed8dd3830439df9894d8b08ef99c0ab"} Mar 18 14:25:54 crc kubenswrapper[4756]: I0318 14:25:54.990180 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.000455 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.021389 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 14:25:55 crc kubenswrapper[4756]: E0318 14:25:55.021821 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228ca85e-a493-4dc4-9b95-5148c92ba228" containerName="rabbitmq" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.021837 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="228ca85e-a493-4dc4-9b95-5148c92ba228" containerName="rabbitmq" Mar 18 14:25:55 crc kubenswrapper[4756]: E0318 14:25:55.021849 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228ca85e-a493-4dc4-9b95-5148c92ba228" containerName="setup-container" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.021855 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="228ca85e-a493-4dc4-9b95-5148c92ba228" containerName="setup-container" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 
14:25:55.022286 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="228ca85e-a493-4dc4-9b95-5148c92ba228" containerName="rabbitmq" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.024274 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.025994 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.028249 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.028384 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.028520 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.028768 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6nmfp" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.028916 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.029043 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.046022 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.094442 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.094512 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.094535 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.094551 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.094596 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.094616 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.094658 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.094701 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8v8q\" (UniqueName: \"kubernetes.io/projected/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-kube-api-access-k8v8q\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.094766 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.094787 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.094813 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.196473 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.196519 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.196560 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.196598 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8v8q\" (UniqueName: \"kubernetes.io/projected/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-kube-api-access-k8v8q\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.196660 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.196683 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.196709 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.196732 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.196773 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.196792 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.196815 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.197680 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.197925 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.198174 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.198243 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 
14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.198730 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.203071 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.203237 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.203499 4756 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.203532 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/74db1f4c9ab2a4ba86b05e72f95f0efbae4c6a7d3cd1801a16cdc636a751481a/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.210965 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.211785 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.220160 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8v8q\" (UniqueName: \"kubernetes.io/projected/c6dd5f14-94cd-4fee-9798-8c93d27de8b9-kube-api-access-k8v8q\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.252311 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c16bce09-4bdd-4cac-93b2-945569c3acee\") pod \"rabbitmq-cell1-server-0\" (UID: \"c6dd5f14-94cd-4fee-9798-8c93d27de8b9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.335849 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="228ca85e-a493-4dc4-9b95-5148c92ba228" path="/var/lib/kubelet/pods/228ca85e-a493-4dc4-9b95-5148c92ba228/volumes" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.356428 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.836718 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.921795 4756 generic.go:334] "Generic (PLEG): container finished" podID="4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" containerID="217eaa954f197e8110c85ed5f4150ffccec0c41a2f305d03faa6117c9619abfc" exitCode=0 Mar 18 14:25:55 crc kubenswrapper[4756]: I0318 14:25:55.921855 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm" event={"ID":"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53","Type":"ContainerDied","Data":"217eaa954f197e8110c85ed5f4150ffccec0c41a2f305d03faa6117c9619abfc"} Mar 18 14:25:56 crc kubenswrapper[4756]: I0318 14:25:56.937667 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc1f1584-6d11-4821-8a1d-4a58648313e3","Type":"ContainerStarted","Data":"10994d016fdf578c85c6dbdb894ffc6b58c32919151e2af5c9880cd91fd14752"} Mar 18 14:25:58 crc kubenswrapper[4756]: W0318 14:25:58.413553 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6dd5f14_94cd_4fee_9798_8c93d27de8b9.slice/crio-94b73da06864238f4a97c6512739cc3de99402e1ab02eba36764c92f21782b98 
WatchSource:0}: Error finding container 94b73da06864238f4a97c6512739cc3de99402e1ab02eba36764c92f21782b98: Status 404 returned error can't find the container with id 94b73da06864238f4a97c6512739cc3de99402e1ab02eba36764c92f21782b98 Mar 18 14:25:58 crc kubenswrapper[4756]: I0318 14:25:58.960516 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c6dd5f14-94cd-4fee-9798-8c93d27de8b9","Type":"ContainerStarted","Data":"94b73da06864238f4a97c6512739cc3de99402e1ab02eba36764c92f21782b98"} Mar 18 14:25:58 crc kubenswrapper[4756]: I0318 14:25:58.963390 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm" event={"ID":"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53","Type":"ContainerStarted","Data":"440dba5181c581244648d46aba4399c69e76d9684296449d133248c1a80f3ff1"} Mar 18 14:25:58 crc kubenswrapper[4756]: I0318 14:25:58.963582 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm" Mar 18 14:25:58 crc kubenswrapper[4756]: I0318 14:25:58.965179 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14a36d4c-c545-467d-a0a8-0aa38de63eb1","Type":"ContainerStarted","Data":"295e728ddd4909e059b2a3ffcaa3a80abeb5d3863246ec4d6a777185bd895432"} Mar 18 14:25:58 crc kubenswrapper[4756]: I0318 14:25:58.985335 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm" podStartSLOduration=9.985318352 podStartE2EDuration="9.985318352s" podCreationTimestamp="2026-03-18 14:25:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:25:58.984061939 +0000 UTC m=+1560.298479914" watchObservedRunningTime="2026-03-18 14:25:58.985318352 +0000 UTC m=+1560.299736327" Mar 18 14:25:59 crc kubenswrapper[4756]: I0318 14:25:59.979634 4756 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14a36d4c-c545-467d-a0a8-0aa38de63eb1","Type":"ContainerStarted","Data":"a64d5efe870771cb3aae02ec95c33ac106c5d1ebc96c55500b02fc2205678a41"} Mar 18 14:26:00 crc kubenswrapper[4756]: I0318 14:26:00.135463 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564066-spbj4"] Mar 18 14:26:00 crc kubenswrapper[4756]: I0318 14:26:00.136906 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564066-spbj4" Mar 18 14:26:00 crc kubenswrapper[4756]: I0318 14:26:00.140924 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:26:00 crc kubenswrapper[4756]: I0318 14:26:00.141469 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:26:00 crc kubenswrapper[4756]: I0318 14:26:00.141495 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:26:00 crc kubenswrapper[4756]: I0318 14:26:00.149625 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564066-spbj4"] Mar 18 14:26:00 crc kubenswrapper[4756]: I0318 14:26:00.299238 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgq84\" (UniqueName: \"kubernetes.io/projected/e78e70b6-ea00-49f6-b5d9-8695cffcad06-kube-api-access-pgq84\") pod \"auto-csr-approver-29564066-spbj4\" (UID: \"e78e70b6-ea00-49f6-b5d9-8695cffcad06\") " pod="openshift-infra/auto-csr-approver-29564066-spbj4" Mar 18 14:26:00 crc kubenswrapper[4756]: I0318 14:26:00.401183 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgq84\" (UniqueName: \"kubernetes.io/projected/e78e70b6-ea00-49f6-b5d9-8695cffcad06-kube-api-access-pgq84\") pod 
\"auto-csr-approver-29564066-spbj4\" (UID: \"e78e70b6-ea00-49f6-b5d9-8695cffcad06\") " pod="openshift-infra/auto-csr-approver-29564066-spbj4" Mar 18 14:26:00 crc kubenswrapper[4756]: I0318 14:26:00.424048 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgq84\" (UniqueName: \"kubernetes.io/projected/e78e70b6-ea00-49f6-b5d9-8695cffcad06-kube-api-access-pgq84\") pod \"auto-csr-approver-29564066-spbj4\" (UID: \"e78e70b6-ea00-49f6-b5d9-8695cffcad06\") " pod="openshift-infra/auto-csr-approver-29564066-spbj4" Mar 18 14:26:00 crc kubenswrapper[4756]: I0318 14:26:00.451074 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564066-spbj4" Mar 18 14:26:00 crc kubenswrapper[4756]: I0318 14:26:00.895754 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564066-spbj4"] Mar 18 14:26:00 crc kubenswrapper[4756]: W0318 14:26:00.898286 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode78e70b6_ea00_49f6_b5d9_8695cffcad06.slice/crio-855fb49073086965d3cea486b5a312d05f1407f5d11306ab873c067725dc7357 WatchSource:0}: Error finding container 855fb49073086965d3cea486b5a312d05f1407f5d11306ab873c067725dc7357: Status 404 returned error can't find the container with id 855fb49073086965d3cea486b5a312d05f1407f5d11306ab873c067725dc7357 Mar 18 14:26:00 crc kubenswrapper[4756]: I0318 14:26:00.996965 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c6dd5f14-94cd-4fee-9798-8c93d27de8b9","Type":"ContainerStarted","Data":"34910de5f6986d8529ca30a27f18c41e30e99856baa499ec5f81eb2ee529b25d"} Mar 18 14:26:01 crc kubenswrapper[4756]: I0318 14:26:01.009864 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564066-spbj4" 
event={"ID":"e78e70b6-ea00-49f6-b5d9-8695cffcad06","Type":"ContainerStarted","Data":"855fb49073086965d3cea486b5a312d05f1407f5d11306ab873c067725dc7357"} Mar 18 14:26:01 crc kubenswrapper[4756]: I0318 14:26:01.011519 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14a36d4c-c545-467d-a0a8-0aa38de63eb1","Type":"ContainerStarted","Data":"095ecd9c37954dfcc6e0399a449e04f0fc3e9967f24843bc3bf8337d552afca4"} Mar 18 14:26:04 crc kubenswrapper[4756]: I0318 14:26:04.052350 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14a36d4c-c545-467d-a0a8-0aa38de63eb1","Type":"ContainerStarted","Data":"8d1caf0e860986c5abd1f0b6850c9e5d8d7350a19c56973625bea700acbbaa30"} Mar 18 14:26:04 crc kubenswrapper[4756]: I0318 14:26:04.054179 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 14:26:04 crc kubenswrapper[4756]: I0318 14:26:04.083989 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.851076394 podStartE2EDuration="25.083968275s" podCreationTimestamp="2026-03-18 14:25:39 +0000 UTC" firstStartedPulling="2026-03-18 14:25:45.97071249 +0000 UTC m=+1547.285130465" lastFinishedPulling="2026-03-18 14:26:03.203604381 +0000 UTC m=+1564.518022346" observedRunningTime="2026-03-18 14:26:04.078402635 +0000 UTC m=+1565.392820630" watchObservedRunningTime="2026-03-18 14:26:04.083968275 +0000 UTC m=+1565.398386250" Mar 18 14:26:04 crc kubenswrapper[4756]: I0318 14:26:04.824318 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm" Mar 18 14:26:04 crc kubenswrapper[4756]: I0318 14:26:04.880510 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-t92ls"] Mar 18 14:26:04 crc kubenswrapper[4756]: I0318 14:26:04.880725 4756 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" podUID="a6501734-2465-4925-8b50-4b7762bb9c4e" containerName="dnsmasq-dns" containerID="cri-o://f6d1c75a36cfe5377d574a829181cc49563d936b9123fdda423c340b1fc8f637" gracePeriod=10 Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.048797 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-9qbkr"] Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.050899 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.061717 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-9qbkr"] Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.104881 4756 generic.go:334] "Generic (PLEG): container finished" podID="a6501734-2465-4925-8b50-4b7762bb9c4e" containerID="f6d1c75a36cfe5377d574a829181cc49563d936b9123fdda423c340b1fc8f637" exitCode=0 Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.105803 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" event={"ID":"a6501734-2465-4925-8b50-4b7762bb9c4e","Type":"ContainerDied","Data":"f6d1c75a36cfe5377d574a829181cc49563d936b9123fdda423c340b1fc8f637"} Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.211239 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.211361 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.211409 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-config\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.211433 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.211473 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.211495 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-dns-svc\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.211602 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bcqpn\" (UniqueName: \"kubernetes.io/projected/8f23b254-5d70-4ac0-948f-cfa6974416fd-kube-api-access-bcqpn\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.313753 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.313798 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-dns-svc\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.313885 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcqpn\" (UniqueName: \"kubernetes.io/projected/8f23b254-5d70-4ac0-948f-cfa6974416fd-kube-api-access-bcqpn\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.313931 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.313980 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.314015 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-config\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.314038 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.314751 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.314844 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.315728 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.315750 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.320569 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-dns-svc\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.320685 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f23b254-5d70-4ac0-948f-cfa6974416fd-config\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.337531 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcqpn\" (UniqueName: \"kubernetes.io/projected/8f23b254-5d70-4ac0-948f-cfa6974416fd-kube-api-access-bcqpn\") pod \"dnsmasq-dns-85f64749dc-9qbkr\" (UID: \"8f23b254-5d70-4ac0-948f-cfa6974416fd\") " pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.396948 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:05 crc kubenswrapper[4756]: I0318 14:26:05.932263 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.050843 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-dns-swift-storage-0\") pod \"a6501734-2465-4925-8b50-4b7762bb9c4e\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.051228 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnxgx\" (UniqueName: \"kubernetes.io/projected/a6501734-2465-4925-8b50-4b7762bb9c4e-kube-api-access-gnxgx\") pod \"a6501734-2465-4925-8b50-4b7762bb9c4e\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.051286 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-ovsdbserver-nb\") pod \"a6501734-2465-4925-8b50-4b7762bb9c4e\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.051340 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-config\") pod \"a6501734-2465-4925-8b50-4b7762bb9c4e\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.051444 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-ovsdbserver-sb\") pod \"a6501734-2465-4925-8b50-4b7762bb9c4e\" (UID: 
\"a6501734-2465-4925-8b50-4b7762bb9c4e\") " Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.051477 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-dns-svc\") pod \"a6501734-2465-4925-8b50-4b7762bb9c4e\" (UID: \"a6501734-2465-4925-8b50-4b7762bb9c4e\") " Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.067926 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6501734-2465-4925-8b50-4b7762bb9c4e-kube-api-access-gnxgx" (OuterVolumeSpecName: "kube-api-access-gnxgx") pod "a6501734-2465-4925-8b50-4b7762bb9c4e" (UID: "a6501734-2465-4925-8b50-4b7762bb9c4e"). InnerVolumeSpecName "kube-api-access-gnxgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.155152 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnxgx\" (UniqueName: \"kubernetes.io/projected/a6501734-2465-4925-8b50-4b7762bb9c4e-kube-api-access-gnxgx\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.155769 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.156321 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-t92ls" event={"ID":"a6501734-2465-4925-8b50-4b7762bb9c4e","Type":"ContainerDied","Data":"8c858785a0a1cbe3e47863b0c4d02f4ed597bac0981c1ba7154779d1eee3268d"} Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.156372 4756 scope.go:117] "RemoveContainer" containerID="f6d1c75a36cfe5377d574a829181cc49563d936b9123fdda423c340b1fc8f637" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.187521 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-config" (OuterVolumeSpecName: "config") pod "a6501734-2465-4925-8b50-4b7762bb9c4e" (UID: "a6501734-2465-4925-8b50-4b7762bb9c4e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.190784 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a6501734-2465-4925-8b50-4b7762bb9c4e" (UID: "a6501734-2465-4925-8b50-4b7762bb9c4e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.211387 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a6501734-2465-4925-8b50-4b7762bb9c4e" (UID: "a6501734-2465-4925-8b50-4b7762bb9c4e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.211748 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6501734-2465-4925-8b50-4b7762bb9c4e" (UID: "a6501734-2465-4925-8b50-4b7762bb9c4e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.235435 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a6501734-2465-4925-8b50-4b7762bb9c4e" (UID: "a6501734-2465-4925-8b50-4b7762bb9c4e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.260260 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.260476 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.260571 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.260654 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 
14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.260732 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6501734-2465-4925-8b50-4b7762bb9c4e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.279480 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-9qbkr"] Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.333586 4756 scope.go:117] "RemoveContainer" containerID="5798c74bf8a6bab35023505488160932c01cf866f900bc391dd7402af5a8b2e8" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.490168 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-t92ls"] Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.500555 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-t92ls"] Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.914982 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.915046 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.915092 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.915870 4756 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"434d4b042e3195c964b5c61982de1a71dbd601937856a288545bb36d7cbe0017"} pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:26:06 crc kubenswrapper[4756]: I0318 14:26:06.915939 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" containerID="cri-o://434d4b042e3195c964b5c61982de1a71dbd601937856a288545bb36d7cbe0017" gracePeriod=600 Mar 18 14:26:07 crc kubenswrapper[4756]: I0318 14:26:07.166450 4756 generic.go:334] "Generic (PLEG): container finished" podID="8f23b254-5d70-4ac0-948f-cfa6974416fd" containerID="bfc1baf6c3f9ab8cb15d953cf107f4981d26db4770602b415ce0b4e4194a46f5" exitCode=0 Mar 18 14:26:07 crc kubenswrapper[4756]: I0318 14:26:07.166514 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" event={"ID":"8f23b254-5d70-4ac0-948f-cfa6974416fd","Type":"ContainerDied","Data":"bfc1baf6c3f9ab8cb15d953cf107f4981d26db4770602b415ce0b4e4194a46f5"} Mar 18 14:26:07 crc kubenswrapper[4756]: I0318 14:26:07.166729 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" event={"ID":"8f23b254-5d70-4ac0-948f-cfa6974416fd","Type":"ContainerStarted","Data":"b7823d3465131862153e9c5a8e63143846afff0a5f767810cbe462308ceb243f"} Mar 18 14:26:07 crc kubenswrapper[4756]: I0318 14:26:07.188034 4756 generic.go:334] "Generic (PLEG): container finished" podID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerID="434d4b042e3195c964b5c61982de1a71dbd601937856a288545bb36d7cbe0017" exitCode=0 Mar 18 14:26:07 crc kubenswrapper[4756]: I0318 14:26:07.188079 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerDied","Data":"434d4b042e3195c964b5c61982de1a71dbd601937856a288545bb36d7cbe0017"} Mar 18 14:26:07 crc kubenswrapper[4756]: I0318 14:26:07.188129 4756 scope.go:117] "RemoveContainer" containerID="9727a0a3407fffdca1e788ab9dbb2c6a316b6b73611747d0e9dff150fec50fa4" Mar 18 14:26:07 crc kubenswrapper[4756]: I0318 14:26:07.328697 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6501734-2465-4925-8b50-4b7762bb9c4e" path="/var/lib/kubelet/pods/a6501734-2465-4925-8b50-4b7762bb9c4e/volumes" Mar 18 14:26:08 crc kubenswrapper[4756]: I0318 14:26:08.198689 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323"} Mar 18 14:26:08 crc kubenswrapper[4756]: I0318 14:26:08.200519 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" event={"ID":"8f23b254-5d70-4ac0-948f-cfa6974416fd","Type":"ContainerStarted","Data":"d4f627ff6370563e0d65d62ca0fe36faa03d7949194cecd0d96d8fdd25a438e3"} Mar 18 14:26:08 crc kubenswrapper[4756]: I0318 14:26:08.200747 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:08 crc kubenswrapper[4756]: I0318 14:26:08.261520 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" podStartSLOduration=3.261504321 podStartE2EDuration="3.261504321s" podCreationTimestamp="2026-03-18 14:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:26:08.253212767 +0000 UTC m=+1569.567630742" watchObservedRunningTime="2026-03-18 
14:26:08.261504321 +0000 UTC m=+1569.575922296" Mar 18 14:26:08 crc kubenswrapper[4756]: I0318 14:26:08.862987 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 14:26:09 crc kubenswrapper[4756]: I0318 14:26:09.210188 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564066-spbj4" event={"ID":"e78e70b6-ea00-49f6-b5d9-8695cffcad06","Type":"ContainerStarted","Data":"d6ebd677533e1c0881c300219bdb060483ab11e6568210358df4233c1599fac8"} Mar 18 14:26:09 crc kubenswrapper[4756]: I0318 14:26:09.211875 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-lxm7t" event={"ID":"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2","Type":"ContainerStarted","Data":"75adf43572dee2082d88455761472c1f390fa1535eaf69f33930b347c9f759cb"} Mar 18 14:26:09 crc kubenswrapper[4756]: I0318 14:26:09.244582 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564066-spbj4" podStartSLOduration=1.832854556 podStartE2EDuration="9.244561072s" podCreationTimestamp="2026-03-18 14:26:00 +0000 UTC" firstStartedPulling="2026-03-18 14:26:00.901173595 +0000 UTC m=+1562.215591570" lastFinishedPulling="2026-03-18 14:26:08.312880111 +0000 UTC m=+1569.627298086" observedRunningTime="2026-03-18 14:26:09.224395467 +0000 UTC m=+1570.538813442" watchObservedRunningTime="2026-03-18 14:26:09.244561072 +0000 UTC m=+1570.558979047" Mar 18 14:26:11 crc kubenswrapper[4756]: I0318 14:26:11.231539 4756 generic.go:334] "Generic (PLEG): container finished" podID="e78e70b6-ea00-49f6-b5d9-8695cffcad06" containerID="d6ebd677533e1c0881c300219bdb060483ab11e6568210358df4233c1599fac8" exitCode=0 Mar 18 14:26:11 crc kubenswrapper[4756]: I0318 14:26:11.231620 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564066-spbj4" 
event={"ID":"e78e70b6-ea00-49f6-b5d9-8695cffcad06","Type":"ContainerDied","Data":"d6ebd677533e1c0881c300219bdb060483ab11e6568210358df4233c1599fac8"} Mar 18 14:26:11 crc kubenswrapper[4756]: I0318 14:26:11.234091 4756 generic.go:334] "Generic (PLEG): container finished" podID="6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2" containerID="75adf43572dee2082d88455761472c1f390fa1535eaf69f33930b347c9f759cb" exitCode=0 Mar 18 14:26:11 crc kubenswrapper[4756]: I0318 14:26:11.234128 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-lxm7t" event={"ID":"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2","Type":"ContainerDied","Data":"75adf43572dee2082d88455761472c1f390fa1535eaf69f33930b347c9f759cb"} Mar 18 14:26:11 crc kubenswrapper[4756]: I0318 14:26:11.249381 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-lxm7t" podStartSLOduration=4.026341766 podStartE2EDuration="39.249364241s" podCreationTimestamp="2026-03-18 14:25:32 +0000 UTC" firstStartedPulling="2026-03-18 14:25:33.636709422 +0000 UTC m=+1534.951127397" lastFinishedPulling="2026-03-18 14:26:08.859731887 +0000 UTC m=+1570.174149872" observedRunningTime="2026-03-18 14:26:09.273533946 +0000 UTC m=+1570.587951921" watchObservedRunningTime="2026-03-18 14:26:11.249364241 +0000 UTC m=+1572.563782216" Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.255208 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564066-spbj4" event={"ID":"e78e70b6-ea00-49f6-b5d9-8695cffcad06","Type":"ContainerDied","Data":"855fb49073086965d3cea486b5a312d05f1407f5d11306ab873c067725dc7357"} Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.256442 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="855fb49073086965d3cea486b5a312d05f1407f5d11306ab873c067725dc7357" Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.365675 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564066-spbj4" Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.374303 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.508775 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgq84\" (UniqueName: \"kubernetes.io/projected/e78e70b6-ea00-49f6-b5d9-8695cffcad06-kube-api-access-pgq84\") pod \"e78e70b6-ea00-49f6-b5d9-8695cffcad06\" (UID: \"e78e70b6-ea00-49f6-b5d9-8695cffcad06\") " Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.509073 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-scripts\") pod \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.509241 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wktnb\" (UniqueName: \"kubernetes.io/projected/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-kube-api-access-wktnb\") pod \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.509523 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-combined-ca-bundle\") pod \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.509672 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-config-data\") pod \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\" (UID: 
\"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.509821 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-certs\") pod \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\" (UID: \"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2\") " Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.986968 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e78e70b6-ea00-49f6-b5d9-8695cffcad06-kube-api-access-pgq84" (OuterVolumeSpecName: "kube-api-access-pgq84") pod "e78e70b6-ea00-49f6-b5d9-8695cffcad06" (UID: "e78e70b6-ea00-49f6-b5d9-8695cffcad06"). InnerVolumeSpecName "kube-api-access-pgq84". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.987265 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-scripts" (OuterVolumeSpecName: "scripts") pod "6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2" (UID: "6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.991522 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-kube-api-access-wktnb" (OuterVolumeSpecName: "kube-api-access-wktnb") pod "6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2" (UID: "6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2"). InnerVolumeSpecName "kube-api-access-wktnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.991593 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-certs" (OuterVolumeSpecName: "certs") pod "6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2" (UID: "6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.993398 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2" (UID: "6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:13 crc kubenswrapper[4756]: I0318 14:26:13.993518 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-config-data" (OuterVolumeSpecName: "config-data") pod "6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2" (UID: "6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:14 crc kubenswrapper[4756]: I0318 14:26:14.087943 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:14 crc kubenswrapper[4756]: I0318 14:26:14.088008 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:14 crc kubenswrapper[4756]: I0318 14:26:14.088024 4756 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:14 crc kubenswrapper[4756]: I0318 14:26:14.088040 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgq84\" (UniqueName: \"kubernetes.io/projected/e78e70b6-ea00-49f6-b5d9-8695cffcad06-kube-api-access-pgq84\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:14 crc kubenswrapper[4756]: I0318 14:26:14.088053 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:14 crc kubenswrapper[4756]: I0318 14:26:14.088091 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wktnb\" (UniqueName: \"kubernetes.io/projected/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2-kube-api-access-wktnb\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:14 crc kubenswrapper[4756]: I0318 14:26:14.264877 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564066-spbj4" Mar 18 14:26:14 crc kubenswrapper[4756]: I0318 14:26:14.264881 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-lxm7t" event={"ID":"6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2","Type":"ContainerDied","Data":"e2b2c7369ced4c238693dbe6913e6211885f8d0a269eaaa60340823e10cebdc7"} Mar 18 14:26:14 crc kubenswrapper[4756]: I0318 14:26:14.264924 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-lxm7t" Mar 18 14:26:14 crc kubenswrapper[4756]: I0318 14:26:14.264937 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2b2c7369ced4c238693dbe6913e6211885f8d0a269eaaa60340823e10cebdc7" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.056948 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564060-mkctk"] Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.087377 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564060-mkctk"] Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.112340 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-4gwt6"] Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.126628 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-4gwt6"] Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.192671 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-djhlg"] Mar 18 14:26:15 crc kubenswrapper[4756]: E0318 14:26:15.193102 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6501734-2465-4925-8b50-4b7762bb9c4e" containerName="init" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.193134 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6501734-2465-4925-8b50-4b7762bb9c4e" 
containerName="init" Mar 18 14:26:15 crc kubenswrapper[4756]: E0318 14:26:15.193145 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2" containerName="cloudkitty-db-sync" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.193152 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2" containerName="cloudkitty-db-sync" Mar 18 14:26:15 crc kubenswrapper[4756]: E0318 14:26:15.193180 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78e70b6-ea00-49f6-b5d9-8695cffcad06" containerName="oc" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.193188 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78e70b6-ea00-49f6-b5d9-8695cffcad06" containerName="oc" Mar 18 14:26:15 crc kubenswrapper[4756]: E0318 14:26:15.193199 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6501734-2465-4925-8b50-4b7762bb9c4e" containerName="dnsmasq-dns" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.193205 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6501734-2465-4925-8b50-4b7762bb9c4e" containerName="dnsmasq-dns" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.193378 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6501734-2465-4925-8b50-4b7762bb9c4e" containerName="dnsmasq-dns" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.193397 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78e70b6-ea00-49f6-b5d9-8695cffcad06" containerName="oc" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.193407 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2" containerName="cloudkitty-db-sync" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.194100 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.196346 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.214356 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b12455cd-3670-4081-8e78-d2088ac075cc-certs\") pod \"cloudkitty-storageinit-djhlg\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.214442 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-combined-ca-bundle\") pod \"cloudkitty-storageinit-djhlg\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.214474 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-scripts\") pod \"cloudkitty-storageinit-djhlg\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.214490 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-config-data\") pod \"cloudkitty-storageinit-djhlg\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.214576 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-czrrs\" (UniqueName: \"kubernetes.io/projected/b12455cd-3670-4081-8e78-d2088ac075cc-kube-api-access-czrrs\") pod \"cloudkitty-storageinit-djhlg\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.219470 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-djhlg"] Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.316216 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b12455cd-3670-4081-8e78-d2088ac075cc-certs\") pod \"cloudkitty-storageinit-djhlg\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.316337 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-combined-ca-bundle\") pod \"cloudkitty-storageinit-djhlg\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.316372 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-scripts\") pod \"cloudkitty-storageinit-djhlg\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.316393 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-config-data\") pod \"cloudkitty-storageinit-djhlg\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 
14:26:15.316447 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czrrs\" (UniqueName: \"kubernetes.io/projected/b12455cd-3670-4081-8e78-d2088ac075cc-kube-api-access-czrrs\") pod \"cloudkitty-storageinit-djhlg\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.322776 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b12455cd-3670-4081-8e78-d2088ac075cc-certs\") pod \"cloudkitty-storageinit-djhlg\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.322985 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-config-data\") pod \"cloudkitty-storageinit-djhlg\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.325211 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-combined-ca-bundle\") pod \"cloudkitty-storageinit-djhlg\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.325572 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-scripts\") pod \"cloudkitty-storageinit-djhlg\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.329933 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7001fa44-03c0-4a84-aa50-3005a9c4e1ed" path="/var/lib/kubelet/pods/7001fa44-03c0-4a84-aa50-3005a9c4e1ed/volumes" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.331985 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4432811-b291-4fac-a2e6-ad17c9d83f51" path="/var/lib/kubelet/pods/d4432811-b291-4fac-a2e6-ad17c9d83f51/volumes" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.336263 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czrrs\" (UniqueName: \"kubernetes.io/projected/b12455cd-3670-4081-8e78-d2088ac075cc-kube-api-access-czrrs\") pod \"cloudkitty-storageinit-djhlg\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.398381 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85f64749dc-9qbkr" Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.467092 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-6x4hm"] Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.467367 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm" podUID="4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" containerName="dnsmasq-dns" containerID="cri-o://440dba5181c581244648d46aba4399c69e76d9684296449d133248c1a80f3ff1" gracePeriod=10 Mar 18 14:26:15 crc kubenswrapper[4756]: I0318 14:26:15.514057 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.235177 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-djhlg"] Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.285525 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-djhlg" event={"ID":"b12455cd-3670-4081-8e78-d2088ac075cc","Type":"ContainerStarted","Data":"de4f07f7fddcd1f533b712fe7e1b24740438a0e67ae8eca84162f8512e0a93b0"} Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.306593 4756 generic.go:334] "Generic (PLEG): container finished" podID="4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" containerID="440dba5181c581244648d46aba4399c69e76d9684296449d133248c1a80f3ff1" exitCode=0 Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.306652 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm" event={"ID":"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53","Type":"ContainerDied","Data":"440dba5181c581244648d46aba4399c69e76d9684296449d133248c1a80f3ff1"} Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.719489 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm" Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.853463 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-dns-svc\") pod \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.853551 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-openstack-edpm-ipam\") pod \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.853623 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-dns-swift-storage-0\") pod \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.853675 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vhms\" (UniqueName: \"kubernetes.io/projected/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-kube-api-access-7vhms\") pod \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.853763 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-config\") pod \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.853829 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-ovsdbserver-sb\") pod \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.853945 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-ovsdbserver-nb\") pod \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\" (UID: \"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53\") " Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.868808 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-kube-api-access-7vhms" (OuterVolumeSpecName: "kube-api-access-7vhms") pod "4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" (UID: "4c9a9f66-26c1-4e32-9a6e-d74d63e5be53"). InnerVolumeSpecName "kube-api-access-7vhms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.936708 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" (UID: "4c9a9f66-26c1-4e32-9a6e-d74d63e5be53"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.937391 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-config" (OuterVolumeSpecName: "config") pod "4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" (UID: "4c9a9f66-26c1-4e32-9a6e-d74d63e5be53"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.938635 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" (UID: "4c9a9f66-26c1-4e32-9a6e-d74d63e5be53"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.940702 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" (UID: "4c9a9f66-26c1-4e32-9a6e-d74d63e5be53"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.958083 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.958150 4756 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.958168 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.958180 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vhms\" (UniqueName: \"kubernetes.io/projected/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-kube-api-access-7vhms\") on node \"crc\" DevicePath 
\"\"" Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.958189 4756 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.964650 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" (UID: "4c9a9f66-26c1-4e32-9a6e-d74d63e5be53"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:26:16 crc kubenswrapper[4756]: I0318 14:26:16.965190 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" (UID: "4c9a9f66-26c1-4e32-9a6e-d74d63e5be53"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:26:17 crc kubenswrapper[4756]: I0318 14:26:17.059913 4756 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:17 crc kubenswrapper[4756]: I0318 14:26:17.059950 4756 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:17 crc kubenswrapper[4756]: I0318 14:26:17.322302 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm" Mar 18 14:26:17 crc kubenswrapper[4756]: I0318 14:26:17.328997 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-djhlg" event={"ID":"b12455cd-3670-4081-8e78-d2088ac075cc","Type":"ContainerStarted","Data":"23a0f64851bd0910ace15275d32efbd0fc9768ae29b909441bfcf60b1f7ee25f"} Mar 18 14:26:17 crc kubenswrapper[4756]: I0318 14:26:17.329048 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm" event={"ID":"4c9a9f66-26c1-4e32-9a6e-d74d63e5be53","Type":"ContainerDied","Data":"fdb68db1b4821416d1534fb5b3fc6a003ed8dd3830439df9894d8b08ef99c0ab"} Mar 18 14:26:17 crc kubenswrapper[4756]: I0318 14:26:17.329081 4756 scope.go:117] "RemoveContainer" containerID="440dba5181c581244648d46aba4399c69e76d9684296449d133248c1a80f3ff1" Mar 18 14:26:17 crc kubenswrapper[4756]: I0318 14:26:17.342348 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-djhlg" podStartSLOduration=2.342332259 podStartE2EDuration="2.342332259s" podCreationTimestamp="2026-03-18 14:26:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:26:17.335766552 +0000 UTC m=+1578.650184557" watchObservedRunningTime="2026-03-18 14:26:17.342332259 +0000 UTC m=+1578.656750234" Mar 18 14:26:17 crc kubenswrapper[4756]: I0318 14:26:17.413537 4756 scope.go:117] "RemoveContainer" containerID="217eaa954f197e8110c85ed5f4150ffccec0c41a2f305d03faa6117c9619abfc" Mar 18 14:26:18 crc kubenswrapper[4756]: I0318 14:26:18.333869 4756 generic.go:334] "Generic (PLEG): container finished" podID="b12455cd-3670-4081-8e78-d2088ac075cc" containerID="23a0f64851bd0910ace15275d32efbd0fc9768ae29b909441bfcf60b1f7ee25f" exitCode=0 Mar 18 14:26:18 crc kubenswrapper[4756]: I0318 14:26:18.333980 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-storageinit-djhlg" event={"ID":"b12455cd-3670-4081-8e78-d2088ac075cc","Type":"ContainerDied","Data":"23a0f64851bd0910ace15275d32efbd0fc9768ae29b909441bfcf60b1f7ee25f"} Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.136055 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.330736 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czrrs\" (UniqueName: \"kubernetes.io/projected/b12455cd-3670-4081-8e78-d2088ac075cc-kube-api-access-czrrs\") pod \"b12455cd-3670-4081-8e78-d2088ac075cc\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.330786 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b12455cd-3670-4081-8e78-d2088ac075cc-certs\") pod \"b12455cd-3670-4081-8e78-d2088ac075cc\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.330810 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-combined-ca-bundle\") pod \"b12455cd-3670-4081-8e78-d2088ac075cc\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.331156 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-config-data\") pod \"b12455cd-3670-4081-8e78-d2088ac075cc\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.331225 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-scripts\") pod \"b12455cd-3670-4081-8e78-d2088ac075cc\" (UID: \"b12455cd-3670-4081-8e78-d2088ac075cc\") " Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.343340 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12455cd-3670-4081-8e78-d2088ac075cc-kube-api-access-czrrs" (OuterVolumeSpecName: "kube-api-access-czrrs") pod "b12455cd-3670-4081-8e78-d2088ac075cc" (UID: "b12455cd-3670-4081-8e78-d2088ac075cc"). InnerVolumeSpecName "kube-api-access-czrrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.343489 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12455cd-3670-4081-8e78-d2088ac075cc-certs" (OuterVolumeSpecName: "certs") pod "b12455cd-3670-4081-8e78-d2088ac075cc" (UID: "b12455cd-3670-4081-8e78-d2088ac075cc"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.350299 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-scripts" (OuterVolumeSpecName: "scripts") pod "b12455cd-3670-4081-8e78-d2088ac075cc" (UID: "b12455cd-3670-4081-8e78-d2088ac075cc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.362631 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-djhlg" event={"ID":"b12455cd-3670-4081-8e78-d2088ac075cc","Type":"ContainerDied","Data":"de4f07f7fddcd1f533b712fe7e1b24740438a0e67ae8eca84162f8512e0a93b0"} Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.362674 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de4f07f7fddcd1f533b712fe7e1b24740438a0e67ae8eca84162f8512e0a93b0" Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.362737 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-djhlg" Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.368889 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b12455cd-3670-4081-8e78-d2088ac075cc" (UID: "b12455cd-3670-4081-8e78-d2088ac075cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.371572 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-config-data" (OuterVolumeSpecName: "config-data") pod "b12455cd-3670-4081-8e78-d2088ac075cc" (UID: "b12455cd-3670-4081-8e78-d2088ac075cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.433106 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.433351 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="3e634fb0-be69-4f7f-8cfa-1d9a1ef54459" containerName="cloudkitty-proc" containerID="cri-o://dc58a10256720f85ecc79985639090d2de00d69ed7451735905f7e194e7d28e0" gracePeriod=30 Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.433418 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czrrs\" (UniqueName: \"kubernetes.io/projected/b12455cd-3670-4081-8e78-d2088ac075cc-kube-api-access-czrrs\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.433497 4756 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b12455cd-3670-4081-8e78-d2088ac075cc-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.433511 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.433522 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.433534 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12455cd-3670-4081-8e78-d2088ac075cc-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.450426 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cloudkitty-api-0"] Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.450657 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="cd47c362-dbd1-40b7-8f54-b967d5998fcd" containerName="cloudkitty-api-log" containerID="cri-o://a6651cfd2b97688c0bf583665883ac7fd8b8ad929778ecd5484a983e72694ca7" gracePeriod=30 Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.450767 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="cd47c362-dbd1-40b7-8f54-b967d5998fcd" containerName="cloudkitty-api" containerID="cri-o://2dac5e6313bd5acf6f685e0a53c4526e61552cb8e569499e8928466e7cc65afc" gracePeriod=30 Mar 18 14:26:20 crc kubenswrapper[4756]: I0318 14:26:20.985503 4756 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-api-0" podUID="cd47c362-dbd1-40b7-8f54-b967d5998fcd" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.196:8889/healthcheck\": dial tcp 10.217.0.196:8889: connect: connection refused" Mar 18 14:26:21 crc kubenswrapper[4756]: I0318 14:26:21.375913 4756 generic.go:334] "Generic (PLEG): container finished" podID="3e634fb0-be69-4f7f-8cfa-1d9a1ef54459" containerID="dc58a10256720f85ecc79985639090d2de00d69ed7451735905f7e194e7d28e0" exitCode=0 Mar 18 14:26:21 crc kubenswrapper[4756]: I0318 14:26:21.376041 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459","Type":"ContainerDied","Data":"dc58a10256720f85ecc79985639090d2de00d69ed7451735905f7e194e7d28e0"} Mar 18 14:26:21 crc kubenswrapper[4756]: I0318 14:26:21.378451 4756 generic.go:334] "Generic (PLEG): container finished" podID="cd47c362-dbd1-40b7-8f54-b967d5998fcd" containerID="2dac5e6313bd5acf6f685e0a53c4526e61552cb8e569499e8928466e7cc65afc" exitCode=0 Mar 18 14:26:21 crc kubenswrapper[4756]: I0318 14:26:21.378496 4756 
generic.go:334] "Generic (PLEG): container finished" podID="cd47c362-dbd1-40b7-8f54-b967d5998fcd" containerID="a6651cfd2b97688c0bf583665883ac7fd8b8ad929778ecd5484a983e72694ca7" exitCode=143 Mar 18 14:26:21 crc kubenswrapper[4756]: I0318 14:26:21.378525 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"cd47c362-dbd1-40b7-8f54-b967d5998fcd","Type":"ContainerDied","Data":"2dac5e6313bd5acf6f685e0a53c4526e61552cb8e569499e8928466e7cc65afc"} Mar 18 14:26:21 crc kubenswrapper[4756]: I0318 14:26:21.378565 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"cd47c362-dbd1-40b7-8f54-b967d5998fcd","Type":"ContainerDied","Data":"a6651cfd2b97688c0bf583665883ac7fd8b8ad929778ecd5484a983e72694ca7"} Mar 18 14:26:21 crc kubenswrapper[4756]: I0318 14:26:21.943911 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.064980 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-scripts\") pod \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.065068 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-combined-ca-bundle\") pod \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.065098 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-public-tls-certs\") pod \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\" (UID: 
\"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.065162 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cd47c362-dbd1-40b7-8f54-b967d5998fcd-certs\") pod \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.065258 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8sl7\" (UniqueName: \"kubernetes.io/projected/cd47c362-dbd1-40b7-8f54-b967d5998fcd-kube-api-access-j8sl7\") pod \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.065278 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-config-data-custom\") pod \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.065305 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd47c362-dbd1-40b7-8f54-b967d5998fcd-logs\") pod \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.065392 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-config-data\") pod \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.065420 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-internal-tls-certs\") pod \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\" (UID: \"cd47c362-dbd1-40b7-8f54-b967d5998fcd\") " Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.069613 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd47c362-dbd1-40b7-8f54-b967d5998fcd-logs" (OuterVolumeSpecName: "logs") pod "cd47c362-dbd1-40b7-8f54-b967d5998fcd" (UID: "cd47c362-dbd1-40b7-8f54-b967d5998fcd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.074657 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-scripts" (OuterVolumeSpecName: "scripts") pod "cd47c362-dbd1-40b7-8f54-b967d5998fcd" (UID: "cd47c362-dbd1-40b7-8f54-b967d5998fcd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.077243 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cd47c362-dbd1-40b7-8f54-b967d5998fcd" (UID: "cd47c362-dbd1-40b7-8f54-b967d5998fcd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.078787 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd47c362-dbd1-40b7-8f54-b967d5998fcd-kube-api-access-j8sl7" (OuterVolumeSpecName: "kube-api-access-j8sl7") pod "cd47c362-dbd1-40b7-8f54-b967d5998fcd" (UID: "cd47c362-dbd1-40b7-8f54-b967d5998fcd"). InnerVolumeSpecName "kube-api-access-j8sl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.090305 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd47c362-dbd1-40b7-8f54-b967d5998fcd-certs" (OuterVolumeSpecName: "certs") pod "cd47c362-dbd1-40b7-8f54-b967d5998fcd" (UID: "cd47c362-dbd1-40b7-8f54-b967d5998fcd"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.124004 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-config-data" (OuterVolumeSpecName: "config-data") pod "cd47c362-dbd1-40b7-8f54-b967d5998fcd" (UID: "cd47c362-dbd1-40b7-8f54-b967d5998fcd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.124302 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd47c362-dbd1-40b7-8f54-b967d5998fcd" (UID: "cd47c362-dbd1-40b7-8f54-b967d5998fcd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.143509 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cd47c362-dbd1-40b7-8f54-b967d5998fcd" (UID: "cd47c362-dbd1-40b7-8f54-b967d5998fcd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.157377 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cd47c362-dbd1-40b7-8f54-b967d5998fcd" (UID: "cd47c362-dbd1-40b7-8f54-b967d5998fcd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.168682 4756 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.168937 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.168946 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.168956 4756 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.168965 4756 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cd47c362-dbd1-40b7-8f54-b967d5998fcd-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.168974 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8sl7\" (UniqueName: 
\"kubernetes.io/projected/cd47c362-dbd1-40b7-8f54-b967d5998fcd-kube-api-access-j8sl7\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.168983 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.168991 4756 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd47c362-dbd1-40b7-8f54-b967d5998fcd-logs\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.168998 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd47c362-dbd1-40b7-8f54-b967d5998fcd-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.288428 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.371261 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-combined-ca-bundle\") pod \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.371344 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-config-data-custom\") pod \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.371399 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-certs\") pod \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.371448 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8krnt\" (UniqueName: \"kubernetes.io/projected/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-kube-api-access-8krnt\") pod \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.371475 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-scripts\") pod \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.371509 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-config-data\") pod \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\" (UID: \"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459\") " Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.375458 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-certs" (OuterVolumeSpecName: "certs") pod "3e634fb0-be69-4f7f-8cfa-1d9a1ef54459" (UID: "3e634fb0-be69-4f7f-8cfa-1d9a1ef54459"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.375737 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3e634fb0-be69-4f7f-8cfa-1d9a1ef54459" (UID: "3e634fb0-be69-4f7f-8cfa-1d9a1ef54459"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.376373 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-kube-api-access-8krnt" (OuterVolumeSpecName: "kube-api-access-8krnt") pod "3e634fb0-be69-4f7f-8cfa-1d9a1ef54459" (UID: "3e634fb0-be69-4f7f-8cfa-1d9a1ef54459"). InnerVolumeSpecName "kube-api-access-8krnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.381231 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-scripts" (OuterVolumeSpecName: "scripts") pod "3e634fb0-be69-4f7f-8cfa-1d9a1ef54459" (UID: "3e634fb0-be69-4f7f-8cfa-1d9a1ef54459"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.393418 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"cd47c362-dbd1-40b7-8f54-b967d5998fcd","Type":"ContainerDied","Data":"bed026cff87645bdc31fc780073c1d7f985818edad8a280b57c9f8b93194e4d3"} Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.393457 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.393461 4756 scope.go:117] "RemoveContainer" containerID="2dac5e6313bd5acf6f685e0a53c4526e61552cb8e569499e8928466e7cc65afc" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.399212 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"3e634fb0-be69-4f7f-8cfa-1d9a1ef54459","Type":"ContainerDied","Data":"984634bbda65dc8e437d31246e224f4d083fdc5ba9a9bf357e0c3f3dbf791e91"} Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.399309 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.403083 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-config-data" (OuterVolumeSpecName: "config-data") pod "3e634fb0-be69-4f7f-8cfa-1d9a1ef54459" (UID: "3e634fb0-be69-4f7f-8cfa-1d9a1ef54459"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.406608 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e634fb0-be69-4f7f-8cfa-1d9a1ef54459" (UID: "3e634fb0-be69-4f7f-8cfa-1d9a1ef54459"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.480021 4756 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.480090 4756 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.480134 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8krnt\" (UniqueName: \"kubernetes.io/projected/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-kube-api-access-8krnt\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.480159 4756 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.480189 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.480209 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.505896 4756 scope.go:117] "RemoveContainer" containerID="a6651cfd2b97688c0bf583665883ac7fd8b8ad929778ecd5484a983e72694ca7" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.522858 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cloudkitty-api-0"] Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.530587 4756 scope.go:117] "RemoveContainer" containerID="dc58a10256720f85ecc79985639090d2de00d69ed7451735905f7e194e7d28e0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.543989 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.557961 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Mar 18 14:26:22 crc kubenswrapper[4756]: E0318 14:26:22.558376 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd47c362-dbd1-40b7-8f54-b967d5998fcd" containerName="cloudkitty-api" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.558395 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd47c362-dbd1-40b7-8f54-b967d5998fcd" containerName="cloudkitty-api" Mar 18 14:26:22 crc kubenswrapper[4756]: E0318 14:26:22.558424 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e634fb0-be69-4f7f-8cfa-1d9a1ef54459" containerName="cloudkitty-proc" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.558431 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e634fb0-be69-4f7f-8cfa-1d9a1ef54459" containerName="cloudkitty-proc" Mar 18 14:26:22 crc kubenswrapper[4756]: E0318 14:26:22.558444 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" containerName="dnsmasq-dns" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.558451 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" containerName="dnsmasq-dns" Mar 18 14:26:22 crc kubenswrapper[4756]: E0318 14:26:22.558460 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" containerName="init" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.558465 4756 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" containerName="init" Mar 18 14:26:22 crc kubenswrapper[4756]: E0318 14:26:22.558483 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd47c362-dbd1-40b7-8f54-b967d5998fcd" containerName="cloudkitty-api-log" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.558489 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd47c362-dbd1-40b7-8f54-b967d5998fcd" containerName="cloudkitty-api-log" Mar 18 14:26:22 crc kubenswrapper[4756]: E0318 14:26:22.558501 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12455cd-3670-4081-8e78-d2088ac075cc" containerName="cloudkitty-storageinit" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.558506 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12455cd-3670-4081-8e78-d2088ac075cc" containerName="cloudkitty-storageinit" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.558686 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e634fb0-be69-4f7f-8cfa-1d9a1ef54459" containerName="cloudkitty-proc" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.558704 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" containerName="dnsmasq-dns" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.558719 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12455cd-3670-4081-8e78-d2088ac075cc" containerName="cloudkitty-storageinit" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.558727 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd47c362-dbd1-40b7-8f54-b967d5998fcd" containerName="cloudkitty-api-log" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.558738 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd47c362-dbd1-40b7-8f54-b967d5998fcd" containerName="cloudkitty-api" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.559814 4756 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.562582 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.562847 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.563024 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.573780 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.581550 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-scripts\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.581666 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.581704 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.581791 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-config-data\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.581863 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m88tf\" (UniqueName: \"kubernetes.io/projected/518ce4af-ca96-475d-9514-f7304f6a2498-kube-api-access-m88tf\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.581932 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/518ce4af-ca96-475d-9514-f7304f6a2498-certs\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.582064 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/518ce4af-ca96-475d-9514-f7304f6a2498-logs\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.582216 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.582331 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.683982 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-scripts\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.684053 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.684088 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.684184 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-config-data\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.684214 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m88tf\" (UniqueName: \"kubernetes.io/projected/518ce4af-ca96-475d-9514-f7304f6a2498-kube-api-access-m88tf\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " 
pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.684237 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/518ce4af-ca96-475d-9514-f7304f6a2498-certs\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.684263 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/518ce4af-ca96-475d-9514-f7304f6a2498-logs\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.684318 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.684344 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.686191 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/518ce4af-ca96-475d-9514-f7304f6a2498-logs\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.688212 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.689544 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.689733 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.690097 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-scripts\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.690439 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-config-data\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.690620 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/518ce4af-ca96-475d-9514-f7304f6a2498-certs\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 
14:26:22.690916 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/518ce4af-ca96-475d-9514-f7304f6a2498-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.705833 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m88tf\" (UniqueName: \"kubernetes.io/projected/518ce4af-ca96-475d-9514-f7304f6a2498-kube-api-access-m88tf\") pod \"cloudkitty-api-0\" (UID: \"518ce4af-ca96-475d-9514-f7304f6a2498\") " pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.740733 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.760478 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.774017 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.775498 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.777827 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.787401 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e468a87-954b-4269-b75f-e8aed6cc63aa-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.787542 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e468a87-954b-4269-b75f-e8aed6cc63aa-config-data\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.787594 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8e468a87-954b-4269-b75f-e8aed6cc63aa-certs\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.787664 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e468a87-954b-4269-b75f-e8aed6cc63aa-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.787704 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8e468a87-954b-4269-b75f-e8aed6cc63aa-scripts\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.787746 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n87d2\" (UniqueName: \"kubernetes.io/projected/8e468a87-954b-4269-b75f-e8aed6cc63aa-kube-api-access-n87d2\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.788622 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.876294 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.889078 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n87d2\" (UniqueName: \"kubernetes.io/projected/8e468a87-954b-4269-b75f-e8aed6cc63aa-kube-api-access-n87d2\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.889214 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e468a87-954b-4269-b75f-e8aed6cc63aa-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.889291 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e468a87-954b-4269-b75f-e8aed6cc63aa-config-data\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " 
pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.889355 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8e468a87-954b-4269-b75f-e8aed6cc63aa-certs\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.889415 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e468a87-954b-4269-b75f-e8aed6cc63aa-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.889461 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e468a87-954b-4269-b75f-e8aed6cc63aa-scripts\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.894205 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e468a87-954b-4269-b75f-e8aed6cc63aa-scripts\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.894377 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e468a87-954b-4269-b75f-e8aed6cc63aa-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.895176 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8e468a87-954b-4269-b75f-e8aed6cc63aa-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.895250 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e468a87-954b-4269-b75f-e8aed6cc63aa-config-data\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.895382 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8e468a87-954b-4269-b75f-e8aed6cc63aa-certs\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:22 crc kubenswrapper[4756]: I0318 14:26:22.905505 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n87d2\" (UniqueName: \"kubernetes.io/projected/8e468a87-954b-4269-b75f-e8aed6cc63aa-kube-api-access-n87d2\") pod \"cloudkitty-proc-0\" (UID: \"8e468a87-954b-4269-b75f-e8aed6cc63aa\") " pod="openstack/cloudkitty-proc-0" Mar 18 14:26:23 crc kubenswrapper[4756]: I0318 14:26:23.097148 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 18 14:26:23 crc kubenswrapper[4756]: I0318 14:26:23.329515 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e634fb0-be69-4f7f-8cfa-1d9a1ef54459" path="/var/lib/kubelet/pods/3e634fb0-be69-4f7f-8cfa-1d9a1ef54459/volumes" Mar 18 14:26:23 crc kubenswrapper[4756]: I0318 14:26:23.330395 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd47c362-dbd1-40b7-8f54-b967d5998fcd" path="/var/lib/kubelet/pods/cd47c362-dbd1-40b7-8f54-b967d5998fcd/volumes" Mar 18 14:26:23 crc kubenswrapper[4756]: I0318 14:26:23.341608 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 18 14:26:23 crc kubenswrapper[4756]: W0318 14:26:23.347249 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod518ce4af_ca96_475d_9514_f7304f6a2498.slice/crio-b62e252050a4966c364eb3b91c8a362e2f048a6768b6acd03ac2605ea58adb92 WatchSource:0}: Error finding container b62e252050a4966c364eb3b91c8a362e2f048a6768b6acd03ac2605ea58adb92: Status 404 returned error can't find the container with id b62e252050a4966c364eb3b91c8a362e2f048a6768b6acd03ac2605ea58adb92 Mar 18 14:26:23 crc kubenswrapper[4756]: I0318 14:26:23.415471 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"518ce4af-ca96-475d-9514-f7304f6a2498","Type":"ContainerStarted","Data":"b62e252050a4966c364eb3b91c8a362e2f048a6768b6acd03ac2605ea58adb92"} Mar 18 14:26:23 crc kubenswrapper[4756]: I0318 14:26:23.566181 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 18 14:26:24 crc kubenswrapper[4756]: I0318 14:26:24.433453 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" 
event={"ID":"518ce4af-ca96-475d-9514-f7304f6a2498","Type":"ContainerStarted","Data":"1888eaec1bcb673e70b1ea7f054ed050e95f8307ca8e4e50923f01126fb7279d"} Mar 18 14:26:24 crc kubenswrapper[4756]: I0318 14:26:24.433956 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Mar 18 14:26:24 crc kubenswrapper[4756]: I0318 14:26:24.433970 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"518ce4af-ca96-475d-9514-f7304f6a2498","Type":"ContainerStarted","Data":"9379c7559bbf7fb44eedd920b3791cd82ccf5f23964dbb5a47c3b19ed8c0d9c6"} Mar 18 14:26:24 crc kubenswrapper[4756]: I0318 14:26:24.439246 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8e468a87-954b-4269-b75f-e8aed6cc63aa","Type":"ContainerStarted","Data":"0bd8a6cdffd58acd8c86872965fb4fd8ce5ae9793024fb4646c32c70904b3ff5"} Mar 18 14:26:24 crc kubenswrapper[4756]: I0318 14:26:24.460427 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.460408165 podStartE2EDuration="2.460408165s" podCreationTimestamp="2026-03-18 14:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:26:24.459019197 +0000 UTC m=+1585.773437192" watchObservedRunningTime="2026-03-18 14:26:24.460408165 +0000 UTC m=+1585.774826150" Mar 18 14:26:25 crc kubenswrapper[4756]: I0318 14:26:25.452859 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8e468a87-954b-4269-b75f-e8aed6cc63aa","Type":"ContainerStarted","Data":"1e4c71058547d7aeab1df2e6f751a20f268df068794471fe7294ef20faf135e2"} Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.578657 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=4.7251199 
podStartE2EDuration="5.578633128s" podCreationTimestamp="2026-03-18 14:26:22 +0000 UTC" firstStartedPulling="2026-03-18 14:26:23.572023244 +0000 UTC m=+1584.886441229" lastFinishedPulling="2026-03-18 14:26:24.425536482 +0000 UTC m=+1585.739954457" observedRunningTime="2026-03-18 14:26:25.482359738 +0000 UTC m=+1586.796777723" watchObservedRunningTime="2026-03-18 14:26:27.578633128 +0000 UTC m=+1588.893051113" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.582330 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg"] Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.584694 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.591604 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.591738 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.592237 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.592242 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.621617 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg"] Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.708267 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.708387 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.708470 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87674\" (UniqueName: \"kubernetes.io/projected/fc4e996a-8348-462e-87dd-552f33102a82-kube-api-access-87674\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.708610 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.810655 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-ssh-key-openstack-edpm-ipam\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.810719 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.810802 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.810880 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87674\" (UniqueName: \"kubernetes.io/projected/fc4e996a-8348-462e-87dd-552f33102a82-kube-api-access-87674\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.816993 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.817317 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.817698 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.835464 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87674\" (UniqueName: \"kubernetes.io/projected/fc4e996a-8348-462e-87dd-552f33102a82-kube-api-access-87674\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" Mar 18 14:26:27 crc kubenswrapper[4756]: I0318 14:26:27.906540 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" Mar 18 14:26:28 crc kubenswrapper[4756]: I0318 14:26:28.481646 4756 generic.go:334] "Generic (PLEG): container finished" podID="cc1f1584-6d11-4821-8a1d-4a58648313e3" containerID="10994d016fdf578c85c6dbdb894ffc6b58c32919151e2af5c9880cd91fd14752" exitCode=0 Mar 18 14:26:28 crc kubenswrapper[4756]: I0318 14:26:28.481864 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc1f1584-6d11-4821-8a1d-4a58648313e3","Type":"ContainerDied","Data":"10994d016fdf578c85c6dbdb894ffc6b58c32919151e2af5c9880cd91fd14752"} Mar 18 14:26:28 crc kubenswrapper[4756]: I0318 14:26:28.515508 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg"] Mar 18 14:26:28 crc kubenswrapper[4756]: I0318 14:26:28.532083 4756 scope.go:117] "RemoveContainer" containerID="71080e2f7a923687a82b980e51f4a20a6b2641243d30bff3a1c4f9763a551a86" Mar 18 14:26:28 crc kubenswrapper[4756]: I0318 14:26:28.635200 4756 scope.go:117] "RemoveContainer" containerID="60d7ce43d6b4d49750a3f726650b715ceb1c8c79673e18a2165fd579133993d9" Mar 18 14:26:28 crc kubenswrapper[4756]: I0318 14:26:28.712574 4756 scope.go:117] "RemoveContainer" containerID="9e1cdfeacb5d741083531a9b44627ebef31f79c1b82b6e7e2e4feb1175a89cf9" Mar 18 14:26:28 crc kubenswrapper[4756]: I0318 14:26:28.742331 4756 scope.go:117] "RemoveContainer" containerID="74fb192257f4a49010e4b73ce8424a4b1a99661af99e75c492cc27e35a2d77ee" Mar 18 14:26:29 crc kubenswrapper[4756]: I0318 14:26:29.494419 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc1f1584-6d11-4821-8a1d-4a58648313e3","Type":"ContainerStarted","Data":"1a69ed6993f74103cf8d8509fdb75df578144143383de6503b2ce3985f79f176"} Mar 18 14:26:29 crc kubenswrapper[4756]: I0318 14:26:29.494739 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-server-0"
Mar 18 14:26:29 crc kubenswrapper[4756]: I0318 14:26:29.496188 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" event={"ID":"fc4e996a-8348-462e-87dd-552f33102a82","Type":"ContainerStarted","Data":"f2f2491628c183c56a70f90861638cc71ad72f44b14a765eb207b7b568f4069e"}
Mar 18 14:26:29 crc kubenswrapper[4756]: I0318 14:26:29.526549 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.526532327 podStartE2EDuration="36.526532327s" podCreationTimestamp="2026-03-18 14:25:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:26:29.516328112 +0000 UTC m=+1590.830746087" watchObservedRunningTime="2026-03-18 14:26:29.526532327 +0000 UTC m=+1590.840950302"
Mar 18 14:26:32 crc kubenswrapper[4756]: I0318 14:26:32.544807 4756 generic.go:334] "Generic (PLEG): container finished" podID="c6dd5f14-94cd-4fee-9798-8c93d27de8b9" containerID="34910de5f6986d8529ca30a27f18c41e30e99856baa499ec5f81eb2ee529b25d" exitCode=0
Mar 18 14:26:32 crc kubenswrapper[4756]: I0318 14:26:32.544891 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c6dd5f14-94cd-4fee-9798-8c93d27de8b9","Type":"ContainerDied","Data":"34910de5f6986d8529ca30a27f18c41e30e99856baa499ec5f81eb2ee529b25d"}
Mar 18 14:26:37 crc kubenswrapper[4756]: I0318 14:26:37.604492 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" event={"ID":"fc4e996a-8348-462e-87dd-552f33102a82","Type":"ContainerStarted","Data":"4782d3d54ae999fbfd1262d8c2d09054657bd5d45d5f1643673e12fef5a4f6be"}
Mar 18 14:26:37 crc kubenswrapper[4756]: I0318 14:26:37.610866 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c6dd5f14-94cd-4fee-9798-8c93d27de8b9","Type":"ContainerStarted","Data":"07329066659a3b24cd83aad358193d19446ecfc3e8cd0834974c793f3dfb2a1b"}
Mar 18 14:26:37 crc kubenswrapper[4756]: I0318 14:26:37.612043 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 14:26:37 crc kubenswrapper[4756]: I0318 14:26:37.643637 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" podStartSLOduration=2.236502636 podStartE2EDuration="10.643602775s" podCreationTimestamp="2026-03-18 14:26:27 +0000 UTC" firstStartedPulling="2026-03-18 14:26:28.5173356 +0000 UTC m=+1589.831753575" lastFinishedPulling="2026-03-18 14:26:36.924435739 +0000 UTC m=+1598.238853714" observedRunningTime="2026-03-18 14:26:37.62935823 +0000 UTC m=+1598.943776205" watchObservedRunningTime="2026-03-18 14:26:37.643602775 +0000 UTC m=+1598.958020790"
Mar 18 14:26:37 crc kubenswrapper[4756]: I0318 14:26:37.661251 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.661226622 podStartE2EDuration="43.661226622s" podCreationTimestamp="2026-03-18 14:25:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:26:37.652521806 +0000 UTC m=+1598.966939821" watchObservedRunningTime="2026-03-18 14:26:37.661226622 +0000 UTC m=+1598.975644637"
Mar 18 14:26:40 crc kubenswrapper[4756]: I0318 14:26:40.129876 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 18 14:26:43 crc kubenswrapper[4756]: I0318 14:26:43.905343 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 18 14:26:47 crc kubenswrapper[4756]: I0318 14:26:47.392186 4756 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod4c9a9f66-26c1-4e32-9a6e-d74d63e5be53"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod4c9a9f66-26c1-4e32-9a6e-d74d63e5be53] : Timed out while waiting for systemd to remove kubepods-besteffort-pod4c9a9f66_26c1_4e32_9a6e_d74d63e5be53.slice"
Mar 18 14:26:47 crc kubenswrapper[4756]: E0318 14:26:47.392545 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod4c9a9f66-26c1-4e32-9a6e-d74d63e5be53] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod4c9a9f66-26c1-4e32-9a6e-d74d63e5be53] : Timed out while waiting for systemd to remove kubepods-besteffort-pod4c9a9f66_26c1_4e32_9a6e_d74d63e5be53.slice" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm" podUID="4c9a9f66-26c1-4e32-9a6e-d74d63e5be53"
Mar 18 14:26:47 crc kubenswrapper[4756]: I0318 14:26:47.720221 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-6x4hm"
Mar 18 14:26:47 crc kubenswrapper[4756]: I0318 14:26:47.763818 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-6x4hm"]
Mar 18 14:26:47 crc kubenswrapper[4756]: I0318 14:26:47.773022 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-6x4hm"]
Mar 18 14:26:48 crc kubenswrapper[4756]: I0318 14:26:48.734670 4756 generic.go:334] "Generic (PLEG): container finished" podID="fc4e996a-8348-462e-87dd-552f33102a82" containerID="4782d3d54ae999fbfd1262d8c2d09054657bd5d45d5f1643673e12fef5a4f6be" exitCode=0
Mar 18 14:26:48 crc kubenswrapper[4756]: I0318 14:26:48.734734 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" event={"ID":"fc4e996a-8348-462e-87dd-552f33102a82","Type":"ContainerDied","Data":"4782d3d54ae999fbfd1262d8c2d09054657bd5d45d5f1643673e12fef5a4f6be"}
Mar 18 14:26:49 crc kubenswrapper[4756]: I0318 14:26:49.329270 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c9a9f66-26c1-4e32-9a6e-d74d63e5be53" path="/var/lib/kubelet/pods/4c9a9f66-26c1-4e32-9a6e-d74d63e5be53/volumes"
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.354590 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg"
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.407618 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-repo-setup-combined-ca-bundle\") pod \"fc4e996a-8348-462e-87dd-552f33102a82\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") "
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.407710 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-ssh-key-openstack-edpm-ipam\") pod \"fc4e996a-8348-462e-87dd-552f33102a82\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") "
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.407898 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-inventory\") pod \"fc4e996a-8348-462e-87dd-552f33102a82\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") "
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.407942 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87674\" (UniqueName: \"kubernetes.io/projected/fc4e996a-8348-462e-87dd-552f33102a82-kube-api-access-87674\") pod \"fc4e996a-8348-462e-87dd-552f33102a82\" (UID: \"fc4e996a-8348-462e-87dd-552f33102a82\") "
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.439239 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "fc4e996a-8348-462e-87dd-552f33102a82" (UID: "fc4e996a-8348-462e-87dd-552f33102a82"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.453332 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc4e996a-8348-462e-87dd-552f33102a82-kube-api-access-87674" (OuterVolumeSpecName: "kube-api-access-87674") pod "fc4e996a-8348-462e-87dd-552f33102a82" (UID: "fc4e996a-8348-462e-87dd-552f33102a82"). InnerVolumeSpecName "kube-api-access-87674". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.487211 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-inventory" (OuterVolumeSpecName: "inventory") pod "fc4e996a-8348-462e-87dd-552f33102a82" (UID: "fc4e996a-8348-462e-87dd-552f33102a82"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.489691 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fc4e996a-8348-462e-87dd-552f33102a82" (UID: "fc4e996a-8348-462e-87dd-552f33102a82"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.510091 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.510142 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87674\" (UniqueName: \"kubernetes.io/projected/fc4e996a-8348-462e-87dd-552f33102a82-kube-api-access-87674\") on node \"crc\" DevicePath \"\""
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.510154 4756 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.510163 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fc4e996a-8348-462e-87dd-552f33102a82-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.758559 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg" event={"ID":"fc4e996a-8348-462e-87dd-552f33102a82","Type":"ContainerDied","Data":"f2f2491628c183c56a70f90861638cc71ad72f44b14a765eb207b7b568f4069e"}
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.758609 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2f2491628c183c56a70f90861638cc71ad72f44b14a765eb207b7b568f4069e"
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.758631 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg"
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.841925 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"]
Mar 18 14:26:50 crc kubenswrapper[4756]: E0318 14:26:50.842414 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc4e996a-8348-462e-87dd-552f33102a82" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.842435 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4e996a-8348-462e-87dd-552f33102a82" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.842639 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc4e996a-8348-462e-87dd-552f33102a82" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.844329 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.846590 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.846666 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.848436 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6"
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.851279 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.866407 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"]
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.916921 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prk4t\" (UniqueName: \"kubernetes.io/projected/2fa72d83-fcee-4f4a-8105-15921e405491-kube-api-access-prk4t\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5m7kb\" (UID: \"2fa72d83-fcee-4f4a-8105-15921e405491\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.916995 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fa72d83-fcee-4f4a-8105-15921e405491-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5m7kb\" (UID: \"2fa72d83-fcee-4f4a-8105-15921e405491\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"
Mar 18 14:26:50 crc kubenswrapper[4756]: I0318 14:26:50.917434 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fa72d83-fcee-4f4a-8105-15921e405491-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5m7kb\" (UID: \"2fa72d83-fcee-4f4a-8105-15921e405491\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"
Mar 18 14:26:51 crc kubenswrapper[4756]: I0318 14:26:51.019190 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prk4t\" (UniqueName: \"kubernetes.io/projected/2fa72d83-fcee-4f4a-8105-15921e405491-kube-api-access-prk4t\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5m7kb\" (UID: \"2fa72d83-fcee-4f4a-8105-15921e405491\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"
Mar 18 14:26:51 crc kubenswrapper[4756]: I0318 14:26:51.019564 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fa72d83-fcee-4f4a-8105-15921e405491-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5m7kb\" (UID: \"2fa72d83-fcee-4f4a-8105-15921e405491\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"
Mar 18 14:26:51 crc kubenswrapper[4756]: I0318 14:26:51.020413 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fa72d83-fcee-4f4a-8105-15921e405491-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5m7kb\" (UID: \"2fa72d83-fcee-4f4a-8105-15921e405491\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"
Mar 18 14:26:51 crc kubenswrapper[4756]: I0318 14:26:51.025870 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fa72d83-fcee-4f4a-8105-15921e405491-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5m7kb\" (UID: \"2fa72d83-fcee-4f4a-8105-15921e405491\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"
Mar 18 14:26:51 crc kubenswrapper[4756]: I0318 14:26:51.026251 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fa72d83-fcee-4f4a-8105-15921e405491-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5m7kb\" (UID: \"2fa72d83-fcee-4f4a-8105-15921e405491\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"
Mar 18 14:26:51 crc kubenswrapper[4756]: I0318 14:26:51.037802 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prk4t\" (UniqueName: \"kubernetes.io/projected/2fa72d83-fcee-4f4a-8105-15921e405491-kube-api-access-prk4t\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-5m7kb\" (UID: \"2fa72d83-fcee-4f4a-8105-15921e405491\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"
Mar 18 14:26:51 crc kubenswrapper[4756]: I0318 14:26:51.160060 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"
Mar 18 14:26:51 crc kubenswrapper[4756]: I0318 14:26:51.757150 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"]
Mar 18 14:26:51 crc kubenswrapper[4756]: I0318 14:26:51.779077 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb" event={"ID":"2fa72d83-fcee-4f4a-8105-15921e405491","Type":"ContainerStarted","Data":"6f745cc73fd9e027f141de3b282d5408777393301973d6693f5bcf3c169f2e9a"}
Mar 18 14:26:52 crc kubenswrapper[4756]: I0318 14:26:52.790414 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb" event={"ID":"2fa72d83-fcee-4f4a-8105-15921e405491","Type":"ContainerStarted","Data":"7c353bc96d5d1c2ff79d614f226f42c7fa76324089ed510cd62b2bccf997b9d2"}
Mar 18 14:26:52 crc kubenswrapper[4756]: I0318 14:26:52.815631 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb" podStartSLOduration=2.112753287 podStartE2EDuration="2.815614182s" podCreationTimestamp="2026-03-18 14:26:50 +0000 UTC" firstStartedPulling="2026-03-18 14:26:51.768563971 +0000 UTC m=+1613.082981946" lastFinishedPulling="2026-03-18 14:26:52.471424866 +0000 UTC m=+1613.785842841" observedRunningTime="2026-03-18 14:26:52.80664916 +0000 UTC m=+1614.121067135" watchObservedRunningTime="2026-03-18 14:26:52.815614182 +0000 UTC m=+1614.130032157"
Mar 18 14:26:55 crc kubenswrapper[4756]: I0318 14:26:55.359970 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 14:26:55 crc kubenswrapper[4756]: I0318 14:26:55.815452 4756 generic.go:334] "Generic (PLEG): container finished" podID="2fa72d83-fcee-4f4a-8105-15921e405491" containerID="7c353bc96d5d1c2ff79d614f226f42c7fa76324089ed510cd62b2bccf997b9d2" exitCode=0
Mar 18 14:26:55 crc kubenswrapper[4756]: I0318 14:26:55.815494 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb" event={"ID":"2fa72d83-fcee-4f4a-8105-15921e405491","Type":"ContainerDied","Data":"7c353bc96d5d1c2ff79d614f226f42c7fa76324089ed510cd62b2bccf997b9d2"}
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.521235 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.672139 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fa72d83-fcee-4f4a-8105-15921e405491-inventory\") pod \"2fa72d83-fcee-4f4a-8105-15921e405491\" (UID: \"2fa72d83-fcee-4f4a-8105-15921e405491\") "
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.672212 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fa72d83-fcee-4f4a-8105-15921e405491-ssh-key-openstack-edpm-ipam\") pod \"2fa72d83-fcee-4f4a-8105-15921e405491\" (UID: \"2fa72d83-fcee-4f4a-8105-15921e405491\") "
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.672306 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prk4t\" (UniqueName: \"kubernetes.io/projected/2fa72d83-fcee-4f4a-8105-15921e405491-kube-api-access-prk4t\") pod \"2fa72d83-fcee-4f4a-8105-15921e405491\" (UID: \"2fa72d83-fcee-4f4a-8105-15921e405491\") "
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.677155 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fa72d83-fcee-4f4a-8105-15921e405491-kube-api-access-prk4t" (OuterVolumeSpecName: "kube-api-access-prk4t") pod "2fa72d83-fcee-4f4a-8105-15921e405491" (UID: "2fa72d83-fcee-4f4a-8105-15921e405491"). InnerVolumeSpecName "kube-api-access-prk4t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.710583 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fa72d83-fcee-4f4a-8105-15921e405491-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2fa72d83-fcee-4f4a-8105-15921e405491" (UID: "2fa72d83-fcee-4f4a-8105-15921e405491"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.738820 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fa72d83-fcee-4f4a-8105-15921e405491-inventory" (OuterVolumeSpecName: "inventory") pod "2fa72d83-fcee-4f4a-8105-15921e405491" (UID: "2fa72d83-fcee-4f4a-8105-15921e405491"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.774517 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fa72d83-fcee-4f4a-8105-15921e405491-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.774556 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fa72d83-fcee-4f4a-8105-15921e405491-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.774568 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prk4t\" (UniqueName: \"kubernetes.io/projected/2fa72d83-fcee-4f4a-8105-15921e405491-kube-api-access-prk4t\") on node \"crc\" DevicePath \"\""
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.835392 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb" event={"ID":"2fa72d83-fcee-4f4a-8105-15921e405491","Type":"ContainerDied","Data":"6f745cc73fd9e027f141de3b282d5408777393301973d6693f5bcf3c169f2e9a"}
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.835432 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f745cc73fd9e027f141de3b282d5408777393301973d6693f5bcf3c169f2e9a"
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.835454 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-5m7kb"
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.895791 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"]
Mar 18 14:26:57 crc kubenswrapper[4756]: E0318 14:26:57.896233 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fa72d83-fcee-4f4a-8105-15921e405491" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.896250 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa72d83-fcee-4f4a-8105-15921e405491" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.896444 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fa72d83-fcee-4f4a-8105-15921e405491" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.897151 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.899712 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.899732 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6"
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.899840 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.900604 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.911755 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"]
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.977802 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.978232 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.978337 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfqjv\" (UniqueName: \"kubernetes.io/projected/06faf283-4cbe-459f-81b6-ca3f598ae5b0-kube-api-access-bfqjv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"
Mar 18 14:26:57 crc kubenswrapper[4756]: I0318 14:26:57.978422 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"
Mar 18 14:26:58 crc kubenswrapper[4756]: I0318 14:26:58.079895 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"
Mar 18 14:26:58 crc kubenswrapper[4756]: I0318 14:26:58.080258 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"
Mar 18 14:26:58 crc kubenswrapper[4756]: I0318 14:26:58.080359 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfqjv\" (UniqueName: \"kubernetes.io/projected/06faf283-4cbe-459f-81b6-ca3f598ae5b0-kube-api-access-bfqjv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"
Mar 18 14:26:58 crc kubenswrapper[4756]: I0318 14:26:58.080699 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"
Mar 18 14:26:58 crc kubenswrapper[4756]: I0318 14:26:58.084551 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"
Mar 18 14:26:58 crc kubenswrapper[4756]: I0318 14:26:58.085759 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"
Mar 18 14:26:58 crc kubenswrapper[4756]: I0318 14:26:58.089039 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"
Mar 18 14:26:58 crc kubenswrapper[4756]: I0318 14:26:58.114349 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfqjv\" (UniqueName: \"kubernetes.io/projected/06faf283-4cbe-459f-81b6-ca3f598ae5b0-kube-api-access-bfqjv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"
Mar 18 14:26:58 crc kubenswrapper[4756]: I0318 14:26:58.219412 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"
Mar 18 14:26:58 crc kubenswrapper[4756]: I0318 14:26:58.765091 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq"]
Mar 18 14:26:58 crc kubenswrapper[4756]: W0318 14:26:58.772752 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06faf283_4cbe_459f_81b6_ca3f598ae5b0.slice/crio-47952848748ae6d28ac9a7b8524a3ca351974f31b9d95c6e58d622e1ed3ddbae WatchSource:0}: Error finding container 47952848748ae6d28ac9a7b8524a3ca351974f31b9d95c6e58d622e1ed3ddbae: Status 404 returned error can't find the container with id 47952848748ae6d28ac9a7b8524a3ca351974f31b9d95c6e58d622e1ed3ddbae
Mar 18 14:26:58 crc kubenswrapper[4756]: I0318 14:26:58.851599 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq" event={"ID":"06faf283-4cbe-459f-81b6-ca3f598ae5b0","Type":"ContainerStarted","Data":"47952848748ae6d28ac9a7b8524a3ca351974f31b9d95c6e58d622e1ed3ddbae"}
Mar 18 14:26:59 crc kubenswrapper[4756]: I0318 14:26:59.532044 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 14:26:59 crc kubenswrapper[4756]: I0318 14:26:59.769739 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0"
Mar 18 14:26:59 crc kubenswrapper[4756]: I0318 14:26:59.860724 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq" event={"ID":"06faf283-4cbe-459f-81b6-ca3f598ae5b0","Type":"ContainerStarted","Data":"00ef45778e1976480d20af21399cc2e858e5844bda55a7a09a62c05d06b98e13"}
Mar 18 14:26:59 crc kubenswrapper[4756]: I0318 14:26:59.886782 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq" podStartSLOduration=2.132981187 podStartE2EDuration="2.886760689s" podCreationTimestamp="2026-03-18 14:26:57 +0000 UTC" firstStartedPulling="2026-03-18 14:26:58.775379778 +0000 UTC m=+1620.089797753" lastFinishedPulling="2026-03-18 14:26:59.52915928 +0000 UTC m=+1620.843577255" observedRunningTime="2026-03-18 14:26:59.875541576 +0000 UTC m=+1621.189959551" watchObservedRunningTime="2026-03-18 14:26:59.886760689 +0000 UTC m=+1621.201178664"
Mar 18 14:27:17 crc kubenswrapper[4756]: I0318 14:27:17.146570 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q6vh6"]
Mar 18 14:27:17 crc kubenswrapper[4756]: I0318 14:27:17.150192 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6vh6"
Mar 18 14:27:17 crc kubenswrapper[4756]: I0318 14:27:17.161707 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6vh6"]
Mar 18 14:27:17 crc kubenswrapper[4756]: I0318 14:27:17.303490 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzjg9\" (UniqueName: \"kubernetes.io/projected/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-kube-api-access-kzjg9\") pod \"redhat-marketplace-q6vh6\" (UID: \"847f8e82-73d7-4adf-8dfd-fadc6305cdf1\") " pod="openshift-marketplace/redhat-marketplace-q6vh6"
Mar 18 14:27:17 crc kubenswrapper[4756]: I0318 14:27:17.303655 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-catalog-content\") pod \"redhat-marketplace-q6vh6\" (UID: \"847f8e82-73d7-4adf-8dfd-fadc6305cdf1\") " pod="openshift-marketplace/redhat-marketplace-q6vh6"
Mar 18 14:27:17 crc kubenswrapper[4756]: I0318 14:27:17.303700 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-utilities\") pod \"redhat-marketplace-q6vh6\" (UID: \"847f8e82-73d7-4adf-8dfd-fadc6305cdf1\") " pod="openshift-marketplace/redhat-marketplace-q6vh6"
Mar 18 14:27:17 crc kubenswrapper[4756]: I0318 14:27:17.405599 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-catalog-content\") pod \"redhat-marketplace-q6vh6\" (UID: \"847f8e82-73d7-4adf-8dfd-fadc6305cdf1\") " pod="openshift-marketplace/redhat-marketplace-q6vh6"
Mar 18 14:27:17 crc kubenswrapper[4756]: I0318 14:27:17.405653 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-utilities\") pod \"redhat-marketplace-q6vh6\" (UID: \"847f8e82-73d7-4adf-8dfd-fadc6305cdf1\") " pod="openshift-marketplace/redhat-marketplace-q6vh6"
Mar 18 14:27:17 crc kubenswrapper[4756]: I0318 14:27:17.405782 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzjg9\" (UniqueName: \"kubernetes.io/projected/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-kube-api-access-kzjg9\") pod \"redhat-marketplace-q6vh6\" (UID: \"847f8e82-73d7-4adf-8dfd-fadc6305cdf1\") " pod="openshift-marketplace/redhat-marketplace-q6vh6"
Mar 18 14:27:17 crc kubenswrapper[4756]: I0318 14:27:17.406280 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-catalog-content\") pod \"redhat-marketplace-q6vh6\" (UID: \"847f8e82-73d7-4adf-8dfd-fadc6305cdf1\") " pod="openshift-marketplace/redhat-marketplace-q6vh6"
Mar 18 14:27:17 crc kubenswrapper[4756]: I0318 14:27:17.406297 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-utilities\") pod \"redhat-marketplace-q6vh6\" (UID: \"847f8e82-73d7-4adf-8dfd-fadc6305cdf1\") " pod="openshift-marketplace/redhat-marketplace-q6vh6"
Mar 18 14:27:17 crc kubenswrapper[4756]: I0318 14:27:17.424645 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzjg9\" (UniqueName: \"kubernetes.io/projected/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-kube-api-access-kzjg9\") pod \"redhat-marketplace-q6vh6\" (UID: \"847f8e82-73d7-4adf-8dfd-fadc6305cdf1\") " pod="openshift-marketplace/redhat-marketplace-q6vh6"
Mar 18 14:27:17 crc kubenswrapper[4756]: I0318 14:27:17.472992 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6vh6"
Mar 18 14:27:17 crc kubenswrapper[4756]: I0318 14:27:17.974899 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6vh6"]
Mar 18 14:27:17 crc kubenswrapper[4756]: W0318 14:27:17.976788 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod847f8e82_73d7_4adf_8dfd_fadc6305cdf1.slice/crio-3df5bd859226ce08a9f82449af62fff51afe77fb1a934aedbf0db98deefeb746 WatchSource:0}: Error finding container 3df5bd859226ce08a9f82449af62fff51afe77fb1a934aedbf0db98deefeb746: Status 404 returned error can't find the container with id 3df5bd859226ce08a9f82449af62fff51afe77fb1a934aedbf0db98deefeb746
Mar 18 14:27:18 crc kubenswrapper[4756]: I0318 14:27:18.071353 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6vh6" event={"ID":"847f8e82-73d7-4adf-8dfd-fadc6305cdf1","Type":"ContainerStarted","Data":"3df5bd859226ce08a9f82449af62fff51afe77fb1a934aedbf0db98deefeb746"}
Mar 18 14:27:19 crc kubenswrapper[4756]: I0318 14:27:19.090717 4756 generic.go:334] "Generic (PLEG): container finished" podID="847f8e82-73d7-4adf-8dfd-fadc6305cdf1" containerID="6c414f8f1f13dd4bd115061d92c7d1ea708da3c2738fb6694059b60344da07e9" exitCode=0
Mar 18 14:27:19 crc kubenswrapper[4756]: I0318 14:27:19.090831 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6vh6" event={"ID":"847f8e82-73d7-4adf-8dfd-fadc6305cdf1","Type":"ContainerDied","Data":"6c414f8f1f13dd4bd115061d92c7d1ea708da3c2738fb6694059b60344da07e9"}
Mar 18 14:27:22 crc kubenswrapper[4756]: I0318 14:27:22.119635 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6vh6"
event={"ID":"847f8e82-73d7-4adf-8dfd-fadc6305cdf1","Type":"ContainerStarted","Data":"8a17dbf6f1a3b2b7b42d46a9bfef44c0ab8c62f561cdf8f584f88a8cabaa7da1"} Mar 18 14:27:25 crc kubenswrapper[4756]: I0318 14:27:25.153029 4756 generic.go:334] "Generic (PLEG): container finished" podID="847f8e82-73d7-4adf-8dfd-fadc6305cdf1" containerID="8a17dbf6f1a3b2b7b42d46a9bfef44c0ab8c62f561cdf8f584f88a8cabaa7da1" exitCode=0 Mar 18 14:27:25 crc kubenswrapper[4756]: I0318 14:27:25.153056 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6vh6" event={"ID":"847f8e82-73d7-4adf-8dfd-fadc6305cdf1","Type":"ContainerDied","Data":"8a17dbf6f1a3b2b7b42d46a9bfef44c0ab8c62f561cdf8f584f88a8cabaa7da1"} Mar 18 14:27:26 crc kubenswrapper[4756]: I0318 14:27:26.164281 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6vh6" event={"ID":"847f8e82-73d7-4adf-8dfd-fadc6305cdf1","Type":"ContainerStarted","Data":"25edfd8fc19ae45b2c85ed1debb5336e3f48f9b9c0df548b59ca26cb782ab942"} Mar 18 14:27:26 crc kubenswrapper[4756]: I0318 14:27:26.179518 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q6vh6" podStartSLOduration=2.732388868 podStartE2EDuration="9.179499001s" podCreationTimestamp="2026-03-18 14:27:17 +0000 UTC" firstStartedPulling="2026-03-18 14:27:19.092756142 +0000 UTC m=+1640.407174117" lastFinishedPulling="2026-03-18 14:27:25.539866275 +0000 UTC m=+1646.854284250" observedRunningTime="2026-03-18 14:27:26.178778641 +0000 UTC m=+1647.493196616" watchObservedRunningTime="2026-03-18 14:27:26.179499001 +0000 UTC m=+1647.493916996" Mar 18 14:27:27 crc kubenswrapper[4756]: I0318 14:27:27.474179 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q6vh6" Mar 18 14:27:27 crc kubenswrapper[4756]: I0318 14:27:27.474231 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-q6vh6" Mar 18 14:27:28 crc kubenswrapper[4756]: I0318 14:27:28.544216 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-q6vh6" podUID="847f8e82-73d7-4adf-8dfd-fadc6305cdf1" containerName="registry-server" probeResult="failure" output=< Mar 18 14:27:28 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Mar 18 14:27:28 crc kubenswrapper[4756]: > Mar 18 14:27:28 crc kubenswrapper[4756]: I0318 14:27:28.956450 4756 scope.go:117] "RemoveContainer" containerID="7c17f87498ec948bd713efcefc461bae70853e835995551fae83a4dd7fe7ecd1" Mar 18 14:27:29 crc kubenswrapper[4756]: I0318 14:27:29.401163 4756 scope.go:117] "RemoveContainer" containerID="8b46fe93853f47e87b4dc6bad8803682f8e507f94a42ad20dfe1f0850bf6adf1" Mar 18 14:27:29 crc kubenswrapper[4756]: I0318 14:27:29.636587 4756 scope.go:117] "RemoveContainer" containerID="df35257fe5b149f9fdb55e31ca98d4f6848cea9f144339254af9a623836e2d60" Mar 18 14:27:29 crc kubenswrapper[4756]: I0318 14:27:29.693612 4756 scope.go:117] "RemoveContainer" containerID="7e7793b6e1ef38a798923bc73c219a940941c5e9089015eda346a7411fa3fae1" Mar 18 14:27:37 crc kubenswrapper[4756]: I0318 14:27:37.534311 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q6vh6" Mar 18 14:27:37 crc kubenswrapper[4756]: I0318 14:27:37.588500 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q6vh6" Mar 18 14:27:37 crc kubenswrapper[4756]: I0318 14:27:37.773508 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6vh6"] Mar 18 14:27:39 crc kubenswrapper[4756]: I0318 14:27:39.295470 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q6vh6" podUID="847f8e82-73d7-4adf-8dfd-fadc6305cdf1" 
containerName="registry-server" containerID="cri-o://25edfd8fc19ae45b2c85ed1debb5336e3f48f9b9c0df548b59ca26cb782ab942" gracePeriod=2 Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.163566 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6vh6" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.211793 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzjg9\" (UniqueName: \"kubernetes.io/projected/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-kube-api-access-kzjg9\") pod \"847f8e82-73d7-4adf-8dfd-fadc6305cdf1\" (UID: \"847f8e82-73d7-4adf-8dfd-fadc6305cdf1\") " Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.211891 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-utilities\") pod \"847f8e82-73d7-4adf-8dfd-fadc6305cdf1\" (UID: \"847f8e82-73d7-4adf-8dfd-fadc6305cdf1\") " Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.212244 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-catalog-content\") pod \"847f8e82-73d7-4adf-8dfd-fadc6305cdf1\" (UID: \"847f8e82-73d7-4adf-8dfd-fadc6305cdf1\") " Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.213019 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-utilities" (OuterVolumeSpecName: "utilities") pod "847f8e82-73d7-4adf-8dfd-fadc6305cdf1" (UID: "847f8e82-73d7-4adf-8dfd-fadc6305cdf1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.221263 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-kube-api-access-kzjg9" (OuterVolumeSpecName: "kube-api-access-kzjg9") pod "847f8e82-73d7-4adf-8dfd-fadc6305cdf1" (UID: "847f8e82-73d7-4adf-8dfd-fadc6305cdf1"). InnerVolumeSpecName "kube-api-access-kzjg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.246697 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "847f8e82-73d7-4adf-8dfd-fadc6305cdf1" (UID: "847f8e82-73d7-4adf-8dfd-fadc6305cdf1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.312839 4756 generic.go:334] "Generic (PLEG): container finished" podID="847f8e82-73d7-4adf-8dfd-fadc6305cdf1" containerID="25edfd8fc19ae45b2c85ed1debb5336e3f48f9b9c0df548b59ca26cb782ab942" exitCode=0 Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.312880 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6vh6" event={"ID":"847f8e82-73d7-4adf-8dfd-fadc6305cdf1","Type":"ContainerDied","Data":"25edfd8fc19ae45b2c85ed1debb5336e3f48f9b9c0df548b59ca26cb782ab942"} Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.312909 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6vh6" event={"ID":"847f8e82-73d7-4adf-8dfd-fadc6305cdf1","Type":"ContainerDied","Data":"3df5bd859226ce08a9f82449af62fff51afe77fb1a934aedbf0db98deefeb746"} Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.312908 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6vh6" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.312934 4756 scope.go:117] "RemoveContainer" containerID="25edfd8fc19ae45b2c85ed1debb5336e3f48f9b9c0df548b59ca26cb782ab942" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.314100 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.314145 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzjg9\" (UniqueName: \"kubernetes.io/projected/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-kube-api-access-kzjg9\") on node \"crc\" DevicePath \"\"" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.314157 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/847f8e82-73d7-4adf-8dfd-fadc6305cdf1-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.339719 4756 scope.go:117] "RemoveContainer" containerID="8a17dbf6f1a3b2b7b42d46a9bfef44c0ab8c62f561cdf8f584f88a8cabaa7da1" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.367853 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6vh6"] Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.381346 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6vh6"] Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.383258 4756 scope.go:117] "RemoveContainer" containerID="6c414f8f1f13dd4bd115061d92c7d1ea708da3c2738fb6694059b60344da07e9" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.439596 4756 scope.go:117] "RemoveContainer" containerID="25edfd8fc19ae45b2c85ed1debb5336e3f48f9b9c0df548b59ca26cb782ab942" Mar 18 14:27:40 crc kubenswrapper[4756]: E0318 
14:27:40.440034 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25edfd8fc19ae45b2c85ed1debb5336e3f48f9b9c0df548b59ca26cb782ab942\": container with ID starting with 25edfd8fc19ae45b2c85ed1debb5336e3f48f9b9c0df548b59ca26cb782ab942 not found: ID does not exist" containerID="25edfd8fc19ae45b2c85ed1debb5336e3f48f9b9c0df548b59ca26cb782ab942" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.440063 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25edfd8fc19ae45b2c85ed1debb5336e3f48f9b9c0df548b59ca26cb782ab942"} err="failed to get container status \"25edfd8fc19ae45b2c85ed1debb5336e3f48f9b9c0df548b59ca26cb782ab942\": rpc error: code = NotFound desc = could not find container \"25edfd8fc19ae45b2c85ed1debb5336e3f48f9b9c0df548b59ca26cb782ab942\": container with ID starting with 25edfd8fc19ae45b2c85ed1debb5336e3f48f9b9c0df548b59ca26cb782ab942 not found: ID does not exist" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.440083 4756 scope.go:117] "RemoveContainer" containerID="8a17dbf6f1a3b2b7b42d46a9bfef44c0ab8c62f561cdf8f584f88a8cabaa7da1" Mar 18 14:27:40 crc kubenswrapper[4756]: E0318 14:27:40.443183 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a17dbf6f1a3b2b7b42d46a9bfef44c0ab8c62f561cdf8f584f88a8cabaa7da1\": container with ID starting with 8a17dbf6f1a3b2b7b42d46a9bfef44c0ab8c62f561cdf8f584f88a8cabaa7da1 not found: ID does not exist" containerID="8a17dbf6f1a3b2b7b42d46a9bfef44c0ab8c62f561cdf8f584f88a8cabaa7da1" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.443208 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a17dbf6f1a3b2b7b42d46a9bfef44c0ab8c62f561cdf8f584f88a8cabaa7da1"} err="failed to get container status \"8a17dbf6f1a3b2b7b42d46a9bfef44c0ab8c62f561cdf8f584f88a8cabaa7da1\": rpc 
error: code = NotFound desc = could not find container \"8a17dbf6f1a3b2b7b42d46a9bfef44c0ab8c62f561cdf8f584f88a8cabaa7da1\": container with ID starting with 8a17dbf6f1a3b2b7b42d46a9bfef44c0ab8c62f561cdf8f584f88a8cabaa7da1 not found: ID does not exist" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.443224 4756 scope.go:117] "RemoveContainer" containerID="6c414f8f1f13dd4bd115061d92c7d1ea708da3c2738fb6694059b60344da07e9" Mar 18 14:27:40 crc kubenswrapper[4756]: E0318 14:27:40.443493 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c414f8f1f13dd4bd115061d92c7d1ea708da3c2738fb6694059b60344da07e9\": container with ID starting with 6c414f8f1f13dd4bd115061d92c7d1ea708da3c2738fb6694059b60344da07e9 not found: ID does not exist" containerID="6c414f8f1f13dd4bd115061d92c7d1ea708da3c2738fb6694059b60344da07e9" Mar 18 14:27:40 crc kubenswrapper[4756]: I0318 14:27:40.443516 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c414f8f1f13dd4bd115061d92c7d1ea708da3c2738fb6694059b60344da07e9"} err="failed to get container status \"6c414f8f1f13dd4bd115061d92c7d1ea708da3c2738fb6694059b60344da07e9\": rpc error: code = NotFound desc = could not find container \"6c414f8f1f13dd4bd115061d92c7d1ea708da3c2738fb6694059b60344da07e9\": container with ID starting with 6c414f8f1f13dd4bd115061d92c7d1ea708da3c2738fb6694059b60344da07e9 not found: ID does not exist" Mar 18 14:27:41 crc kubenswrapper[4756]: I0318 14:27:41.327217 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="847f8e82-73d7-4adf-8dfd-fadc6305cdf1" path="/var/lib/kubelet/pods/847f8e82-73d7-4adf-8dfd-fadc6305cdf1/volumes" Mar 18 14:28:00 crc kubenswrapper[4756]: I0318 14:28:00.171041 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564068-4xkm4"] Mar 18 14:28:00 crc kubenswrapper[4756]: E0318 14:28:00.171998 4756 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="847f8e82-73d7-4adf-8dfd-fadc6305cdf1" containerName="extract-content" Mar 18 14:28:00 crc kubenswrapper[4756]: I0318 14:28:00.172010 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="847f8e82-73d7-4adf-8dfd-fadc6305cdf1" containerName="extract-content" Mar 18 14:28:00 crc kubenswrapper[4756]: E0318 14:28:00.172024 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847f8e82-73d7-4adf-8dfd-fadc6305cdf1" containerName="registry-server" Mar 18 14:28:00 crc kubenswrapper[4756]: I0318 14:28:00.172030 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="847f8e82-73d7-4adf-8dfd-fadc6305cdf1" containerName="registry-server" Mar 18 14:28:00 crc kubenswrapper[4756]: E0318 14:28:00.172057 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847f8e82-73d7-4adf-8dfd-fadc6305cdf1" containerName="extract-utilities" Mar 18 14:28:00 crc kubenswrapper[4756]: I0318 14:28:00.172063 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="847f8e82-73d7-4adf-8dfd-fadc6305cdf1" containerName="extract-utilities" Mar 18 14:28:00 crc kubenswrapper[4756]: I0318 14:28:00.172286 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="847f8e82-73d7-4adf-8dfd-fadc6305cdf1" containerName="registry-server" Mar 18 14:28:00 crc kubenswrapper[4756]: I0318 14:28:00.173038 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564068-4xkm4" Mar 18 14:28:00 crc kubenswrapper[4756]: I0318 14:28:00.188792 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:28:00 crc kubenswrapper[4756]: I0318 14:28:00.189006 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:28:00 crc kubenswrapper[4756]: I0318 14:28:00.189205 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:28:00 crc kubenswrapper[4756]: I0318 14:28:00.215172 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564068-4xkm4"] Mar 18 14:28:00 crc kubenswrapper[4756]: I0318 14:28:00.329891 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmpcq\" (UniqueName: \"kubernetes.io/projected/71b5953e-a4c2-44bd-ae8c-74ded1ebba07-kube-api-access-qmpcq\") pod \"auto-csr-approver-29564068-4xkm4\" (UID: \"71b5953e-a4c2-44bd-ae8c-74ded1ebba07\") " pod="openshift-infra/auto-csr-approver-29564068-4xkm4" Mar 18 14:28:00 crc kubenswrapper[4756]: I0318 14:28:00.431653 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmpcq\" (UniqueName: \"kubernetes.io/projected/71b5953e-a4c2-44bd-ae8c-74ded1ebba07-kube-api-access-qmpcq\") pod \"auto-csr-approver-29564068-4xkm4\" (UID: \"71b5953e-a4c2-44bd-ae8c-74ded1ebba07\") " pod="openshift-infra/auto-csr-approver-29564068-4xkm4" Mar 18 14:28:00 crc kubenswrapper[4756]: I0318 14:28:00.452822 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmpcq\" (UniqueName: \"kubernetes.io/projected/71b5953e-a4c2-44bd-ae8c-74ded1ebba07-kube-api-access-qmpcq\") pod \"auto-csr-approver-29564068-4xkm4\" (UID: \"71b5953e-a4c2-44bd-ae8c-74ded1ebba07\") " 
pod="openshift-infra/auto-csr-approver-29564068-4xkm4" Mar 18 14:28:00 crc kubenswrapper[4756]: I0318 14:28:00.514248 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564068-4xkm4" Mar 18 14:28:00 crc kubenswrapper[4756]: I0318 14:28:00.998783 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564068-4xkm4"] Mar 18 14:28:01 crc kubenswrapper[4756]: I0318 14:28:01.524083 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564068-4xkm4" event={"ID":"71b5953e-a4c2-44bd-ae8c-74ded1ebba07","Type":"ContainerStarted","Data":"ad972d29c04a0b663d8c73839cabfa9d6d8057afa693bf1b0a93c84baed7567e"} Mar 18 14:28:02 crc kubenswrapper[4756]: I0318 14:28:02.534440 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564068-4xkm4" event={"ID":"71b5953e-a4c2-44bd-ae8c-74ded1ebba07","Type":"ContainerStarted","Data":"429e283dccfbbf95903a51f71e72765e9d92ca4abec276da08636ca53dd71fce"} Mar 18 14:28:02 crc kubenswrapper[4756]: I0318 14:28:02.553060 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564068-4xkm4" podStartSLOduration=1.370237314 podStartE2EDuration="2.553043935s" podCreationTimestamp="2026-03-18 14:28:00 +0000 UTC" firstStartedPulling="2026-03-18 14:28:01.02508894 +0000 UTC m=+1682.339506915" lastFinishedPulling="2026-03-18 14:28:02.207895561 +0000 UTC m=+1683.522313536" observedRunningTime="2026-03-18 14:28:02.545803479 +0000 UTC m=+1683.860221454" watchObservedRunningTime="2026-03-18 14:28:02.553043935 +0000 UTC m=+1683.867461910" Mar 18 14:28:03 crc kubenswrapper[4756]: I0318 14:28:03.544662 4756 generic.go:334] "Generic (PLEG): container finished" podID="71b5953e-a4c2-44bd-ae8c-74ded1ebba07" containerID="429e283dccfbbf95903a51f71e72765e9d92ca4abec276da08636ca53dd71fce" exitCode=0 Mar 18 14:28:03 crc 
kubenswrapper[4756]: I0318 14:28:03.544754 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564068-4xkm4" event={"ID":"71b5953e-a4c2-44bd-ae8c-74ded1ebba07","Type":"ContainerDied","Data":"429e283dccfbbf95903a51f71e72765e9d92ca4abec276da08636ca53dd71fce"} Mar 18 14:28:05 crc kubenswrapper[4756]: I0318 14:28:05.457820 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564068-4xkm4" Mar 18 14:28:05 crc kubenswrapper[4756]: I0318 14:28:05.568757 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564068-4xkm4" event={"ID":"71b5953e-a4c2-44bd-ae8c-74ded1ebba07","Type":"ContainerDied","Data":"ad972d29c04a0b663d8c73839cabfa9d6d8057afa693bf1b0a93c84baed7567e"} Mar 18 14:28:05 crc kubenswrapper[4756]: I0318 14:28:05.568801 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad972d29c04a0b663d8c73839cabfa9d6d8057afa693bf1b0a93c84baed7567e" Mar 18 14:28:05 crc kubenswrapper[4756]: I0318 14:28:05.568915 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564068-4xkm4" Mar 18 14:28:05 crc kubenswrapper[4756]: I0318 14:28:05.612698 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564062-rj85m"] Mar 18 14:28:05 crc kubenswrapper[4756]: I0318 14:28:05.623659 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564062-rj85m"] Mar 18 14:28:05 crc kubenswrapper[4756]: I0318 14:28:05.646819 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmpcq\" (UniqueName: \"kubernetes.io/projected/71b5953e-a4c2-44bd-ae8c-74ded1ebba07-kube-api-access-qmpcq\") pod \"71b5953e-a4c2-44bd-ae8c-74ded1ebba07\" (UID: \"71b5953e-a4c2-44bd-ae8c-74ded1ebba07\") " Mar 18 14:28:05 crc kubenswrapper[4756]: I0318 14:28:05.654523 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b5953e-a4c2-44bd-ae8c-74ded1ebba07-kube-api-access-qmpcq" (OuterVolumeSpecName: "kube-api-access-qmpcq") pod "71b5953e-a4c2-44bd-ae8c-74ded1ebba07" (UID: "71b5953e-a4c2-44bd-ae8c-74ded1ebba07"). InnerVolumeSpecName "kube-api-access-qmpcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:28:05 crc kubenswrapper[4756]: I0318 14:28:05.749774 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmpcq\" (UniqueName: \"kubernetes.io/projected/71b5953e-a4c2-44bd-ae8c-74ded1ebba07-kube-api-access-qmpcq\") on node \"crc\" DevicePath \"\"" Mar 18 14:28:07 crc kubenswrapper[4756]: I0318 14:28:07.325920 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cc376c5-de8d-4577-913c-ba2ed9d1bc75" path="/var/lib/kubelet/pods/5cc376c5-de8d-4577-913c-ba2ed9d1bc75/volumes" Mar 18 14:28:29 crc kubenswrapper[4756]: I0318 14:28:29.941773 4756 scope.go:117] "RemoveContainer" containerID="4daeea50cec98c5deb566ebece1d6febddb5cded68a2ecf68c2ff6f17ca9ff3c" Mar 18 14:28:29 crc kubenswrapper[4756]: I0318 14:28:29.983619 4756 scope.go:117] "RemoveContainer" containerID="329468c575a50e3429f544e362dd65cdb79dc698e77fa8f01bfa0c996edd253c" Mar 18 14:28:30 crc kubenswrapper[4756]: I0318 14:28:30.045900 4756 scope.go:117] "RemoveContainer" containerID="2b72e01f365e79dca0460075d23a884762ad96f2454371ab9b7cb65b691fecde" Mar 18 14:28:30 crc kubenswrapper[4756]: I0318 14:28:30.069671 4756 scope.go:117] "RemoveContainer" containerID="fdc6356e5a4853b64e2f05704d770128b5465e8a47e7f010752fabaa5d058615" Mar 18 14:28:30 crc kubenswrapper[4756]: I0318 14:28:30.117248 4756 scope.go:117] "RemoveContainer" containerID="f723e226feb9712ad1d42e121fc0cf04b0a08349c3222be90d04b7860a7d6695" Mar 18 14:28:30 crc kubenswrapper[4756]: I0318 14:28:30.166320 4756 scope.go:117] "RemoveContainer" containerID="ec76b4cea8296c216664ec5ba1e3c126b34d14be347f1fd7b7ad8daaf3003eeb" Mar 18 14:28:36 crc kubenswrapper[4756]: I0318 14:28:36.915286 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 18 14:28:36 crc kubenswrapper[4756]: I0318 14:28:36.915889 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:29:06 crc kubenswrapper[4756]: I0318 14:29:06.915549 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:29:06 crc kubenswrapper[4756]: I0318 14:29:06.916033 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:29:36 crc kubenswrapper[4756]: I0318 14:29:36.914853 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:29:36 crc kubenswrapper[4756]: I0318 14:29:36.915453 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:29:36 crc kubenswrapper[4756]: I0318 14:29:36.915499 4756 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:29:36 crc kubenswrapper[4756]: I0318 14:29:36.916262 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323"} pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:29:36 crc kubenswrapper[4756]: I0318 14:29:36.916321 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" containerID="cri-o://8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" gracePeriod=600 Mar 18 14:29:37 crc kubenswrapper[4756]: E0318 14:29:37.049436 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:29:37 crc kubenswrapper[4756]: I0318 14:29:37.489078 4756 generic.go:334] "Generic (PLEG): container finished" podID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" exitCode=0 Mar 18 14:29:37 crc kubenswrapper[4756]: I0318 14:29:37.489143 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" 
event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerDied","Data":"8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323"} Mar 18 14:29:37 crc kubenswrapper[4756]: I0318 14:29:37.489182 4756 scope.go:117] "RemoveContainer" containerID="434d4b042e3195c964b5c61982de1a71dbd601937856a288545bb36d7cbe0017" Mar 18 14:29:37 crc kubenswrapper[4756]: I0318 14:29:37.489907 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:29:37 crc kubenswrapper[4756]: E0318 14:29:37.490349 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:29:51 crc kubenswrapper[4756]: I0318 14:29:51.316203 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:29:51 crc kubenswrapper[4756]: E0318 14:29:51.317057 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:29:57 crc kubenswrapper[4756]: I0318 14:29:57.670932 4756 generic.go:334] "Generic (PLEG): container finished" podID="06faf283-4cbe-459f-81b6-ca3f598ae5b0" containerID="00ef45778e1976480d20af21399cc2e858e5844bda55a7a09a62c05d06b98e13" exitCode=0 Mar 18 14:29:57 crc kubenswrapper[4756]: I0318 14:29:57.671206 4756 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq" event={"ID":"06faf283-4cbe-459f-81b6-ca3f598ae5b0","Type":"ContainerDied","Data":"00ef45778e1976480d20af21399cc2e858e5844bda55a7a09a62c05d06b98e13"} Mar 18 14:29:59 crc kubenswrapper[4756]: I0318 14:29:59.689238 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq" event={"ID":"06faf283-4cbe-459f-81b6-ca3f598ae5b0","Type":"ContainerDied","Data":"47952848748ae6d28ac9a7b8524a3ca351974f31b9d95c6e58d622e1ed3ddbae"} Mar 18 14:29:59 crc kubenswrapper[4756]: I0318 14:29:59.689888 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47952848748ae6d28ac9a7b8524a3ca351974f31b9d95c6e58d622e1ed3ddbae" Mar 18 14:29:59 crc kubenswrapper[4756]: I0318 14:29:59.746235 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq" Mar 18 14:29:59 crc kubenswrapper[4756]: I0318 14:29:59.934632 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-ssh-key-openstack-edpm-ipam\") pod \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " Mar 18 14:29:59 crc kubenswrapper[4756]: I0318 14:29:59.934701 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-bootstrap-combined-ca-bundle\") pod \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " Mar 18 14:29:59 crc kubenswrapper[4756]: I0318 14:29:59.934764 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfqjv\" (UniqueName: 
\"kubernetes.io/projected/06faf283-4cbe-459f-81b6-ca3f598ae5b0-kube-api-access-bfqjv\") pod \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " Mar 18 14:29:59 crc kubenswrapper[4756]: I0318 14:29:59.934789 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-inventory\") pod \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\" (UID: \"06faf283-4cbe-459f-81b6-ca3f598ae5b0\") " Mar 18 14:29:59 crc kubenswrapper[4756]: I0318 14:29:59.941679 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06faf283-4cbe-459f-81b6-ca3f598ae5b0-kube-api-access-bfqjv" (OuterVolumeSpecName: "kube-api-access-bfqjv") pod "06faf283-4cbe-459f-81b6-ca3f598ae5b0" (UID: "06faf283-4cbe-459f-81b6-ca3f598ae5b0"). InnerVolumeSpecName "kube-api-access-bfqjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:29:59 crc kubenswrapper[4756]: I0318 14:29:59.943298 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "06faf283-4cbe-459f-81b6-ca3f598ae5b0" (UID: "06faf283-4cbe-459f-81b6-ca3f598ae5b0"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:29:59 crc kubenswrapper[4756]: I0318 14:29:59.963435 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "06faf283-4cbe-459f-81b6-ca3f598ae5b0" (UID: "06faf283-4cbe-459f-81b6-ca3f598ae5b0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:29:59 crc kubenswrapper[4756]: I0318 14:29:59.963689 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-inventory" (OuterVolumeSpecName: "inventory") pod "06faf283-4cbe-459f-81b6-ca3f598ae5b0" (UID: "06faf283-4cbe-459f-81b6-ca3f598ae5b0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.036678 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.036715 4756 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.036726 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfqjv\" (UniqueName: \"kubernetes.io/projected/06faf283-4cbe-459f-81b6-ca3f598ae5b0-kube-api-access-bfqjv\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.036736 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06faf283-4cbe-459f-81b6-ca3f598ae5b0-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.147248 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4"] Mar 18 14:30:00 crc kubenswrapper[4756]: E0318 14:30:00.147854 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b5953e-a4c2-44bd-ae8c-74ded1ebba07" containerName="oc" Mar 18 14:30:00 crc 
kubenswrapper[4756]: I0318 14:30:00.147883 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b5953e-a4c2-44bd-ae8c-74ded1ebba07" containerName="oc" Mar 18 14:30:00 crc kubenswrapper[4756]: E0318 14:30:00.147942 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06faf283-4cbe-459f-81b6-ca3f598ae5b0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.147953 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="06faf283-4cbe-459f-81b6-ca3f598ae5b0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.148269 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b5953e-a4c2-44bd-ae8c-74ded1ebba07" containerName="oc" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.148320 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="06faf283-4cbe-459f-81b6-ca3f598ae5b0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.149356 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.153358 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.157499 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.157738 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564070-6fr7g"] Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.159700 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564070-6fr7g" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.166466 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.166640 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.166746 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.166848 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4"] Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.178013 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564070-6fr7g"] Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.346850 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hvg4\" (UniqueName: \"kubernetes.io/projected/437200a1-9595-448c-8f6f-9612f8ca2e2e-kube-api-access-9hvg4\") pod \"auto-csr-approver-29564070-6fr7g\" (UID: \"437200a1-9595-448c-8f6f-9612f8ca2e2e\") " pod="openshift-infra/auto-csr-approver-29564070-6fr7g" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.347239 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bbe5b527-3c19-4691-9a4a-0344f5d43860-secret-volume\") pod \"collect-profiles-29564070-w95q4\" (UID: \"bbe5b527-3c19-4691-9a4a-0344f5d43860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.347300 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbe5b527-3c19-4691-9a4a-0344f5d43860-config-volume\") pod \"collect-profiles-29564070-w95q4\" (UID: \"bbe5b527-3c19-4691-9a4a-0344f5d43860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.347444 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6q4g\" (UniqueName: \"kubernetes.io/projected/bbe5b527-3c19-4691-9a4a-0344f5d43860-kube-api-access-b6q4g\") pod \"collect-profiles-29564070-w95q4\" (UID: \"bbe5b527-3c19-4691-9a4a-0344f5d43860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.449112 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6q4g\" (UniqueName: \"kubernetes.io/projected/bbe5b527-3c19-4691-9a4a-0344f5d43860-kube-api-access-b6q4g\") pod \"collect-profiles-29564070-w95q4\" (UID: \"bbe5b527-3c19-4691-9a4a-0344f5d43860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.449505 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hvg4\" (UniqueName: \"kubernetes.io/projected/437200a1-9595-448c-8f6f-9612f8ca2e2e-kube-api-access-9hvg4\") pod \"auto-csr-approver-29564070-6fr7g\" (UID: \"437200a1-9595-448c-8f6f-9612f8ca2e2e\") " pod="openshift-infra/auto-csr-approver-29564070-6fr7g" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.449589 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bbe5b527-3c19-4691-9a4a-0344f5d43860-secret-volume\") pod \"collect-profiles-29564070-w95q4\" (UID: \"bbe5b527-3c19-4691-9a4a-0344f5d43860\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.449609 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbe5b527-3c19-4691-9a4a-0344f5d43860-config-volume\") pod \"collect-profiles-29564070-w95q4\" (UID: \"bbe5b527-3c19-4691-9a4a-0344f5d43860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.450574 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbe5b527-3c19-4691-9a4a-0344f5d43860-config-volume\") pod \"collect-profiles-29564070-w95q4\" (UID: \"bbe5b527-3c19-4691-9a4a-0344f5d43860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.453178 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bbe5b527-3c19-4691-9a4a-0344f5d43860-secret-volume\") pod \"collect-profiles-29564070-w95q4\" (UID: \"bbe5b527-3c19-4691-9a4a-0344f5d43860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.466942 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6q4g\" (UniqueName: \"kubernetes.io/projected/bbe5b527-3c19-4691-9a4a-0344f5d43860-kube-api-access-b6q4g\") pod \"collect-profiles-29564070-w95q4\" (UID: \"bbe5b527-3c19-4691-9a4a-0344f5d43860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.469559 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hvg4\" (UniqueName: 
\"kubernetes.io/projected/437200a1-9595-448c-8f6f-9612f8ca2e2e-kube-api-access-9hvg4\") pod \"auto-csr-approver-29564070-6fr7g\" (UID: \"437200a1-9595-448c-8f6f-9612f8ca2e2e\") " pod="openshift-infra/auto-csr-approver-29564070-6fr7g" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.495905 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564070-6fr7g" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.696405 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.768061 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.850332 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr"] Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.851878 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.859651 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.859840 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.859971 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.860078 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.861048 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr"] Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.963469 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/570b883b-276f-43d9-983b-3f99763e7e4d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr\" (UID: \"570b883b-276f-43d9-983b-3f99763e7e4d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 14:30:00.963675 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/570b883b-276f-43d9-983b-3f99763e7e4d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr\" (UID: \"570b883b-276f-43d9-983b-3f99763e7e4d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" Mar 18 14:30:00 crc kubenswrapper[4756]: I0318 
14:30:00.963700 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlr96\" (UniqueName: \"kubernetes.io/projected/570b883b-276f-43d9-983b-3f99763e7e4d-kube-api-access-nlr96\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr\" (UID: \"570b883b-276f-43d9-983b-3f99763e7e4d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" Mar 18 14:30:01 crc kubenswrapper[4756]: I0318 14:30:01.065700 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/570b883b-276f-43d9-983b-3f99763e7e4d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr\" (UID: \"570b883b-276f-43d9-983b-3f99763e7e4d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" Mar 18 14:30:01 crc kubenswrapper[4756]: I0318 14:30:01.065780 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlr96\" (UniqueName: \"kubernetes.io/projected/570b883b-276f-43d9-983b-3f99763e7e4d-kube-api-access-nlr96\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr\" (UID: \"570b883b-276f-43d9-983b-3f99763e7e4d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" Mar 18 14:30:01 crc kubenswrapper[4756]: I0318 14:30:01.066190 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/570b883b-276f-43d9-983b-3f99763e7e4d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr\" (UID: \"570b883b-276f-43d9-983b-3f99763e7e4d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" Mar 18 14:30:01 crc kubenswrapper[4756]: I0318 14:30:01.070996 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/570b883b-276f-43d9-983b-3f99763e7e4d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr\" (UID: \"570b883b-276f-43d9-983b-3f99763e7e4d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" Mar 18 14:30:01 crc kubenswrapper[4756]: I0318 14:30:01.071263 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/570b883b-276f-43d9-983b-3f99763e7e4d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr\" (UID: \"570b883b-276f-43d9-983b-3f99763e7e4d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" Mar 18 14:30:01 crc kubenswrapper[4756]: I0318 14:30:01.081069 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlr96\" (UniqueName: \"kubernetes.io/projected/570b883b-276f-43d9-983b-3f99763e7e4d-kube-api-access-nlr96\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr\" (UID: \"570b883b-276f-43d9-983b-3f99763e7e4d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" Mar 18 14:30:01 crc kubenswrapper[4756]: I0318 14:30:01.138895 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564070-6fr7g"] Mar 18 14:30:01 crc kubenswrapper[4756]: I0318 14:30:01.197620 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" Mar 18 14:30:01 crc kubenswrapper[4756]: I0318 14:30:01.491692 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4"] Mar 18 14:30:01 crc kubenswrapper[4756]: W0318 14:30:01.493592 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbe5b527_3c19_4691_9a4a_0344f5d43860.slice/crio-3349c56de2735db46f31ee2b7e9cab243e86a43b688bb49a81d123bce697e568 WatchSource:0}: Error finding container 3349c56de2735db46f31ee2b7e9cab243e86a43b688bb49a81d123bce697e568: Status 404 returned error can't find the container with id 3349c56de2735db46f31ee2b7e9cab243e86a43b688bb49a81d123bce697e568 Mar 18 14:30:01 crc kubenswrapper[4756]: I0318 14:30:01.710806 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564070-6fr7g" event={"ID":"437200a1-9595-448c-8f6f-9612f8ca2e2e","Type":"ContainerStarted","Data":"461f20f2db8612bb5a9189f7d45d266f83885d8a9eb99dc23fad07f3b2b7d20e"} Mar 18 14:30:01 crc kubenswrapper[4756]: I0318 14:30:01.718290 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" event={"ID":"bbe5b527-3c19-4691-9a4a-0344f5d43860","Type":"ContainerStarted","Data":"26e72f6c2521419842de2d7a3e2b1f24da88fd441da5c199cb0fe359ef143e36"} Mar 18 14:30:01 crc kubenswrapper[4756]: I0318 14:30:01.718346 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" event={"ID":"bbe5b527-3c19-4691-9a4a-0344f5d43860","Type":"ContainerStarted","Data":"3349c56de2735db46f31ee2b7e9cab243e86a43b688bb49a81d123bce697e568"} Mar 18 14:30:01 crc kubenswrapper[4756]: I0318 14:30:01.739890 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" podStartSLOduration=1.739869506 podStartE2EDuration="1.739869506s" podCreationTimestamp="2026-03-18 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:30:01.736608208 +0000 UTC m=+1803.051026183" watchObservedRunningTime="2026-03-18 14:30:01.739869506 +0000 UTC m=+1803.054287491" Mar 18 14:30:01 crc kubenswrapper[4756]: I0318 14:30:01.770920 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr"] Mar 18 14:30:01 crc kubenswrapper[4756]: W0318 14:30:01.775731 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod570b883b_276f_43d9_983b_3f99763e7e4d.slice/crio-fb6ec4fbdd66519f9b1a1ba0c545aa1868a2183a048ed3cf03c1effec2e12a82 WatchSource:0}: Error finding container fb6ec4fbdd66519f9b1a1ba0c545aa1868a2183a048ed3cf03c1effec2e12a82: Status 404 returned error can't find the container with id fb6ec4fbdd66519f9b1a1ba0c545aa1868a2183a048ed3cf03c1effec2e12a82 Mar 18 14:30:02 crc kubenswrapper[4756]: I0318 14:30:02.728873 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" event={"ID":"570b883b-276f-43d9-983b-3f99763e7e4d","Type":"ContainerStarted","Data":"3b226e1f574b597466ba429c7cec1653e3a640f6a12e7d752053b1b936e989b4"} Mar 18 14:30:02 crc kubenswrapper[4756]: I0318 14:30:02.729360 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" event={"ID":"570b883b-276f-43d9-983b-3f99763e7e4d","Type":"ContainerStarted","Data":"fb6ec4fbdd66519f9b1a1ba0c545aa1868a2183a048ed3cf03c1effec2e12a82"} Mar 18 14:30:02 crc kubenswrapper[4756]: I0318 14:30:02.730343 4756 generic.go:334] "Generic (PLEG): container 
finished" podID="bbe5b527-3c19-4691-9a4a-0344f5d43860" containerID="26e72f6c2521419842de2d7a3e2b1f24da88fd441da5c199cb0fe359ef143e36" exitCode=0 Mar 18 14:30:02 crc kubenswrapper[4756]: I0318 14:30:02.730391 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" event={"ID":"bbe5b527-3c19-4691-9a4a-0344f5d43860","Type":"ContainerDied","Data":"26e72f6c2521419842de2d7a3e2b1f24da88fd441da5c199cb0fe359ef143e36"} Mar 18 14:30:02 crc kubenswrapper[4756]: I0318 14:30:02.751522 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" podStartSLOduration=2.075080945 podStartE2EDuration="2.751502647s" podCreationTimestamp="2026-03-18 14:30:00 +0000 UTC" firstStartedPulling="2026-03-18 14:30:01.77848987 +0000 UTC m=+1803.092907855" lastFinishedPulling="2026-03-18 14:30:02.454911582 +0000 UTC m=+1803.769329557" observedRunningTime="2026-03-18 14:30:02.743331517 +0000 UTC m=+1804.057749502" watchObservedRunningTime="2026-03-18 14:30:02.751502647 +0000 UTC m=+1804.065920622" Mar 18 14:30:03 crc kubenswrapper[4756]: I0318 14:30:03.316035 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:30:03 crc kubenswrapper[4756]: E0318 14:30:03.316657 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:30:04 crc kubenswrapper[4756]: I0318 14:30:04.628136 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" Mar 18 14:30:04 crc kubenswrapper[4756]: I0318 14:30:04.747091 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bbe5b527-3c19-4691-9a4a-0344f5d43860-secret-volume\") pod \"bbe5b527-3c19-4691-9a4a-0344f5d43860\" (UID: \"bbe5b527-3c19-4691-9a4a-0344f5d43860\") " Mar 18 14:30:04 crc kubenswrapper[4756]: I0318 14:30:04.747446 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbe5b527-3c19-4691-9a4a-0344f5d43860-config-volume\") pod \"bbe5b527-3c19-4691-9a4a-0344f5d43860\" (UID: \"bbe5b527-3c19-4691-9a4a-0344f5d43860\") " Mar 18 14:30:04 crc kubenswrapper[4756]: I0318 14:30:04.747589 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6q4g\" (UniqueName: \"kubernetes.io/projected/bbe5b527-3c19-4691-9a4a-0344f5d43860-kube-api-access-b6q4g\") pod \"bbe5b527-3c19-4691-9a4a-0344f5d43860\" (UID: \"bbe5b527-3c19-4691-9a4a-0344f5d43860\") " Mar 18 14:30:04 crc kubenswrapper[4756]: I0318 14:30:04.748061 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbe5b527-3c19-4691-9a4a-0344f5d43860-config-volume" (OuterVolumeSpecName: "config-volume") pod "bbe5b527-3c19-4691-9a4a-0344f5d43860" (UID: "bbe5b527-3c19-4691-9a4a-0344f5d43860"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:30:04 crc kubenswrapper[4756]: I0318 14:30:04.752698 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbe5b527-3c19-4691-9a4a-0344f5d43860-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:04 crc kubenswrapper[4756]: I0318 14:30:04.754692 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbe5b527-3c19-4691-9a4a-0344f5d43860-kube-api-access-b6q4g" (OuterVolumeSpecName: "kube-api-access-b6q4g") pod "bbe5b527-3c19-4691-9a4a-0344f5d43860" (UID: "bbe5b527-3c19-4691-9a4a-0344f5d43860"). InnerVolumeSpecName "kube-api-access-b6q4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:30:04 crc kubenswrapper[4756]: I0318 14:30:04.757431 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe5b527-3c19-4691-9a4a-0344f5d43860-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bbe5b527-3c19-4691-9a4a-0344f5d43860" (UID: "bbe5b527-3c19-4691-9a4a-0344f5d43860"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:30:04 crc kubenswrapper[4756]: I0318 14:30:04.762653 4756 generic.go:334] "Generic (PLEG): container finished" podID="437200a1-9595-448c-8f6f-9612f8ca2e2e" containerID="dd068bd5d7c56f6e465654a61fd010916bdfaaaab2c4861d7bdc786d55e07e99" exitCode=0 Mar 18 14:30:04 crc kubenswrapper[4756]: I0318 14:30:04.762741 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564070-6fr7g" event={"ID":"437200a1-9595-448c-8f6f-9612f8ca2e2e","Type":"ContainerDied","Data":"dd068bd5d7c56f6e465654a61fd010916bdfaaaab2c4861d7bdc786d55e07e99"} Mar 18 14:30:04 crc kubenswrapper[4756]: I0318 14:30:04.770007 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" event={"ID":"bbe5b527-3c19-4691-9a4a-0344f5d43860","Type":"ContainerDied","Data":"3349c56de2735db46f31ee2b7e9cab243e86a43b688bb49a81d123bce697e568"} Mar 18 14:30:04 crc kubenswrapper[4756]: I0318 14:30:04.770052 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3349c56de2735db46f31ee2b7e9cab243e86a43b688bb49a81d123bce697e568" Mar 18 14:30:04 crc kubenswrapper[4756]: I0318 14:30:04.770109 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-w95q4" Mar 18 14:30:04 crc kubenswrapper[4756]: I0318 14:30:04.856900 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bbe5b527-3c19-4691-9a4a-0344f5d43860-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:04 crc kubenswrapper[4756]: I0318 14:30:04.856929 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6q4g\" (UniqueName: \"kubernetes.io/projected/bbe5b527-3c19-4691-9a4a-0344f5d43860-kube-api-access-b6q4g\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:06 crc kubenswrapper[4756]: I0318 14:30:06.692683 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564070-6fr7g" Mar 18 14:30:06 crc kubenswrapper[4756]: I0318 14:30:06.789415 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564070-6fr7g" event={"ID":"437200a1-9595-448c-8f6f-9612f8ca2e2e","Type":"ContainerDied","Data":"461f20f2db8612bb5a9189f7d45d266f83885d8a9eb99dc23fad07f3b2b7d20e"} Mar 18 14:30:06 crc kubenswrapper[4756]: I0318 14:30:06.789661 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="461f20f2db8612bb5a9189f7d45d266f83885d8a9eb99dc23fad07f3b2b7d20e" Mar 18 14:30:06 crc kubenswrapper[4756]: I0318 14:30:06.789473 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564070-6fr7g" Mar 18 14:30:06 crc kubenswrapper[4756]: I0318 14:30:06.798076 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hvg4\" (UniqueName: \"kubernetes.io/projected/437200a1-9595-448c-8f6f-9612f8ca2e2e-kube-api-access-9hvg4\") pod \"437200a1-9595-448c-8f6f-9612f8ca2e2e\" (UID: \"437200a1-9595-448c-8f6f-9612f8ca2e2e\") " Mar 18 14:30:06 crc kubenswrapper[4756]: I0318 14:30:06.804847 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/437200a1-9595-448c-8f6f-9612f8ca2e2e-kube-api-access-9hvg4" (OuterVolumeSpecName: "kube-api-access-9hvg4") pod "437200a1-9595-448c-8f6f-9612f8ca2e2e" (UID: "437200a1-9595-448c-8f6f-9612f8ca2e2e"). InnerVolumeSpecName "kube-api-access-9hvg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:30:06 crc kubenswrapper[4756]: I0318 14:30:06.899586 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hvg4\" (UniqueName: \"kubernetes.io/projected/437200a1-9595-448c-8f6f-9612f8ca2e2e-kube-api-access-9hvg4\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:07 crc kubenswrapper[4756]: I0318 14:30:07.777098 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564064-wt6kw"] Mar 18 14:30:07 crc kubenswrapper[4756]: I0318 14:30:07.788323 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564064-wt6kw"] Mar 18 14:30:09 crc kubenswrapper[4756]: I0318 14:30:09.326369 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7958dad1-8aea-4918-be5f-9cfd91a229a9" path="/var/lib/kubelet/pods/7958dad1-8aea-4918-be5f-9cfd91a229a9/volumes" Mar 18 14:30:16 crc kubenswrapper[4756]: I0318 14:30:16.315774 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:30:16 crc 
kubenswrapper[4756]: E0318 14:30:16.317280 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:30:27 crc kubenswrapper[4756]: I0318 14:30:27.315506 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:30:27 crc kubenswrapper[4756]: E0318 14:30:27.316246 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:30:30 crc kubenswrapper[4756]: I0318 14:30:30.420523 4756 scope.go:117] "RemoveContainer" containerID="5727aa64c617c359d928b7d123101e3575db23ec6cea09c351ce25f4dee49f3e" Mar 18 14:30:38 crc kubenswrapper[4756]: I0318 14:30:38.315590 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:30:38 crc kubenswrapper[4756]: E0318 14:30:38.316570 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 
18 14:30:44 crc kubenswrapper[4756]: I0318 14:30:44.055160 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-pkbmp"] Mar 18 14:30:44 crc kubenswrapper[4756]: I0318 14:30:44.071289 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-pkbmp"] Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.045185 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d1e7-account-create-update-6b52m"] Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.053942 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-dc6b-account-create-update-9pd9b"] Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.063273 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-dc6b-account-create-update-9pd9b"] Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.079007 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d1e7-account-create-update-6b52m"] Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.088173 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6bjq5"] Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.095822 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6bjq5"] Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.126840 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-11e9-account-create-update-zj46z"] Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.136662 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-j7cgx"] Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.147273 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-11e9-account-create-update-zj46z"] Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.158333 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-db-create-j7cgx"] Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.343261 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ca56e1e-6fbd-4f49-8abf-3f3610879132" path="/var/lib/kubelet/pods/4ca56e1e-6fbd-4f49-8abf-3f3610879132/volumes" Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.365842 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b8c1050-ec08-4c1b-80a0-09461e459598" path="/var/lib/kubelet/pods/7b8c1050-ec08-4c1b-80a0-09461e459598/volumes" Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.391406 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f9f2f47-c757-4ed2-8cf0-a89921d6e48b" path="/var/lib/kubelet/pods/7f9f2f47-c757-4ed2-8cf0-a89921d6e48b/volumes" Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.396041 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5c3691b-930b-4cfe-b3c4-b5a5a6244e28" path="/var/lib/kubelet/pods/c5c3691b-930b-4cfe-b3c4-b5a5a6244e28/volumes" Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.396773 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9d20605-425f-4245-8f7a-7d0c95e24e25" path="/var/lib/kubelet/pods/f9d20605-425f-4245-8f7a-7d0c95e24e25/volumes" Mar 18 14:30:45 crc kubenswrapper[4756]: I0318 14:30:45.398253 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb9485ce-3028-491d-8149-55e39b14c5a5" path="/var/lib/kubelet/pods/fb9485ce-3028-491d-8149-55e39b14c5a5/volumes" Mar 18 14:30:52 crc kubenswrapper[4756]: I0318 14:30:52.315426 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:30:52 crc kubenswrapper[4756]: E0318 14:30:52.316190 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:30:55 crc kubenswrapper[4756]: I0318 14:30:55.027035 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-7bbnt"] Mar 18 14:30:55 crc kubenswrapper[4756]: I0318 14:30:55.037192 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-7bbnt"] Mar 18 14:30:55 crc kubenswrapper[4756]: I0318 14:30:55.325347 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dc602be-eea4-4a22-bfba-1f6319b32064" path="/var/lib/kubelet/pods/5dc602be-eea4-4a22-bfba-1f6319b32064/volumes" Mar 18 14:31:03 crc kubenswrapper[4756]: I0318 14:31:03.315272 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:31:03 crc kubenswrapper[4756]: E0318 14:31:03.316182 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:31:09 crc kubenswrapper[4756]: I0318 14:31:09.034716 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-g229b"] Mar 18 14:31:09 crc kubenswrapper[4756]: I0318 14:31:09.045466 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-g229b"] Mar 18 14:31:09 crc kubenswrapper[4756]: I0318 14:31:09.325960 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ee9241-87cd-43e5-90d2-869e14cc1eb6" 
path="/var/lib/kubelet/pods/a6ee9241-87cd-43e5-90d2-869e14cc1eb6/volumes" Mar 18 14:31:16 crc kubenswrapper[4756]: I0318 14:31:16.316029 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:31:16 crc kubenswrapper[4756]: E0318 14:31:16.316787 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.065208 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-jrvxc"] Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.075790 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-jrvxc"] Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.087504 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-1934-account-create-update-d94f6"] Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.099186 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-8583-account-create-update-4lbfs"] Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.107753 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-f6cr8"] Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.118688 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-2e41-account-create-update-8hh7d"] Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.127258 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-6vpwc"] Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.136521 4756 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d7ef-account-create-update-6xjvm"] Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.146019 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-dq884"] Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.156103 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d7ef-account-create-update-6xjvm"] Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.167702 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-2e41-account-create-update-8hh7d"] Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.178317 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-dq884"] Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.189692 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-8583-account-create-update-4lbfs"] Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.198489 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-6vpwc"] Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.211111 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-f6cr8"] Mar 18 14:31:24 crc kubenswrapper[4756]: I0318 14:31:24.224451 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-1934-account-create-update-d94f6"] Mar 18 14:31:25 crc kubenswrapper[4756]: I0318 14:31:25.332202 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c" path="/var/lib/kubelet/pods/0f3af1c0-3ab7-43b6-bcd6-7de7fcbf6a0c/volumes" Mar 18 14:31:25 crc kubenswrapper[4756]: I0318 14:31:25.333047 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95" path="/var/lib/kubelet/pods/1cf1d7f9-fba2-4ad5-a5c0-f92452e6ab95/volumes" Mar 18 14:31:25 crc kubenswrapper[4756]: I0318 
14:31:25.333643 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45537606-07d7-49ee-abed-7aab10e9deab" path="/var/lib/kubelet/pods/45537606-07d7-49ee-abed-7aab10e9deab/volumes" Mar 18 14:31:25 crc kubenswrapper[4756]: I0318 14:31:25.334201 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="622c7c3a-8ac1-4286-8bcf-b1444608c489" path="/var/lib/kubelet/pods/622c7c3a-8ac1-4286-8bcf-b1444608c489/volumes" Mar 18 14:31:25 crc kubenswrapper[4756]: I0318 14:31:25.335170 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71dfb945-64cc-4994-bd4e-591797fc6ca8" path="/var/lib/kubelet/pods/71dfb945-64cc-4994-bd4e-591797fc6ca8/volumes" Mar 18 14:31:25 crc kubenswrapper[4756]: I0318 14:31:25.336539 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="859f4c18-f94d-49bd-9287-ffad04cbe5d9" path="/var/lib/kubelet/pods/859f4c18-f94d-49bd-9287-ffad04cbe5d9/volumes" Mar 18 14:31:25 crc kubenswrapper[4756]: I0318 14:31:25.337059 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4c68a42-f073-4460-82f4-d5ddaaa26b05" path="/var/lib/kubelet/pods/a4c68a42-f073-4460-82f4-d5ddaaa26b05/volumes" Mar 18 14:31:25 crc kubenswrapper[4756]: I0318 14:31:25.338012 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da559c8c-1db0-49af-9b45-22af3a40eccf" path="/var/lib/kubelet/pods/da559c8c-1db0-49af-9b45-22af3a40eccf/volumes" Mar 18 14:31:27 crc kubenswrapper[4756]: I0318 14:31:27.315613 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:31:27 crc kubenswrapper[4756]: E0318 14:31:27.316395 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:31:29 crc kubenswrapper[4756]: I0318 14:31:29.038573 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-xtr5q"] Mar 18 14:31:29 crc kubenswrapper[4756]: I0318 14:31:29.050032 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-xtr5q"] Mar 18 14:31:29 crc kubenswrapper[4756]: I0318 14:31:29.330990 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcdc6f55-de64-4698-9c24-35d42eca014c" path="/var/lib/kubelet/pods/fcdc6f55-de64-4698-9c24-35d42eca014c/volumes" Mar 18 14:31:30 crc kubenswrapper[4756]: I0318 14:31:30.502980 4756 scope.go:117] "RemoveContainer" containerID="1cf4bcbe151782e076c6abb900c4dcdc8fcb725ff66d62798995bbf35698cbab" Mar 18 14:31:30 crc kubenswrapper[4756]: I0318 14:31:30.530157 4756 scope.go:117] "RemoveContainer" containerID="7d138f83b8a277738e896134149968b7272f36905cd61120046eab4a3b6d7244" Mar 18 14:31:30 crc kubenswrapper[4756]: I0318 14:31:30.576305 4756 scope.go:117] "RemoveContainer" containerID="7070ad1d9e9382acec4593cdb6ce1d60bbc3cef73a7bd05778c420838d0c1ebf" Mar 18 14:31:30 crc kubenswrapper[4756]: I0318 14:31:30.638735 4756 scope.go:117] "RemoveContainer" containerID="958cf6673af65aee881a8f448643d206ce25761d6e903669449e3138065f1244" Mar 18 14:31:30 crc kubenswrapper[4756]: I0318 14:31:30.678337 4756 scope.go:117] "RemoveContainer" containerID="e21986a3f786c4c31e558ecbca97a99d022702e95cc250625d4368b344e0a9da" Mar 18 14:31:30 crc kubenswrapper[4756]: I0318 14:31:30.766680 4756 scope.go:117] "RemoveContainer" containerID="7c915df2dee3f4663a41dcd9a86ab2b7f72e2623788c2983f916df4f1db627b8" Mar 18 14:31:30 crc kubenswrapper[4756]: I0318 14:31:30.817330 4756 scope.go:117] "RemoveContainer" 
containerID="d075f3a42af1fadbd910d2c6f4615b217c87e1f13bc186fc591aec1a99858977" Mar 18 14:31:30 crc kubenswrapper[4756]: I0318 14:31:30.846604 4756 scope.go:117] "RemoveContainer" containerID="bd9a06e07640582b0cd16a84f32400dfa07f0a04141d345279c0f4ef3b7064d5" Mar 18 14:31:30 crc kubenswrapper[4756]: I0318 14:31:30.867410 4756 scope.go:117] "RemoveContainer" containerID="d326a16ba2ccdc49f389d48265ccc4a917abf90377c058387addb24c132684bc" Mar 18 14:31:30 crc kubenswrapper[4756]: I0318 14:31:30.890629 4756 scope.go:117] "RemoveContainer" containerID="9ea28543579c8a4a82ebf6cc16c9eaec55097eea58578b417baf645ca889972a" Mar 18 14:31:30 crc kubenswrapper[4756]: I0318 14:31:30.911822 4756 scope.go:117] "RemoveContainer" containerID="674550c08e0f9236e4da0f39dfa646f5c61785b2943b5b7e53e7ca5f2728776c" Mar 18 14:31:30 crc kubenswrapper[4756]: I0318 14:31:30.934795 4756 scope.go:117] "RemoveContainer" containerID="6fc9a7d1d02d193d67a567672382370fbce3d244cddd47ce909eae652a02e425" Mar 18 14:31:30 crc kubenswrapper[4756]: I0318 14:31:30.960338 4756 scope.go:117] "RemoveContainer" containerID="e5c2a81656bfe4293055d4c913767388205276cef98d6c9857c9a3afdff264d0" Mar 18 14:31:30 crc kubenswrapper[4756]: I0318 14:31:30.985757 4756 scope.go:117] "RemoveContainer" containerID="01ec4c5abb00ab9fa9100e70ad4ed31a2eeb7b94c0d9d5245285ad66338ffee0" Mar 18 14:31:31 crc kubenswrapper[4756]: I0318 14:31:31.005234 4756 scope.go:117] "RemoveContainer" containerID="3bd4f5b43753c02ab8a98e8a37967e6af8f6846631774c1152e9a280b7a35bd2" Mar 18 14:31:31 crc kubenswrapper[4756]: I0318 14:31:31.025353 4756 scope.go:117] "RemoveContainer" containerID="de0d5819c9180839002918cdaf9bfe6b093a031a737e6272c62c887f2f8a5717" Mar 18 14:31:31 crc kubenswrapper[4756]: I0318 14:31:31.051938 4756 scope.go:117] "RemoveContainer" containerID="68c334552cdf9d26b95d33d9b9928951ca2afe018c7d8befac930d53be5506f4" Mar 18 14:31:31 crc kubenswrapper[4756]: I0318 14:31:31.076111 4756 scope.go:117] "RemoveContainer" 
containerID="1cd38dc7d27a2683d688d561cae96bd617d30db38ab59c51082dfbe8585f365d" Mar 18 14:31:31 crc kubenswrapper[4756]: I0318 14:31:31.095432 4756 scope.go:117] "RemoveContainer" containerID="33fe0a061152e5b59c3f8a7bbaa77565e496e5d2f5d13ee7143400ef8cacc2e7" Mar 18 14:31:38 crc kubenswrapper[4756]: I0318 14:31:38.315998 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:31:38 crc kubenswrapper[4756]: E0318 14:31:38.317179 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:31:39 crc kubenswrapper[4756]: I0318 14:31:39.766917 4756 generic.go:334] "Generic (PLEG): container finished" podID="570b883b-276f-43d9-983b-3f99763e7e4d" containerID="3b226e1f574b597466ba429c7cec1653e3a640f6a12e7d752053b1b936e989b4" exitCode=0 Mar 18 14:31:39 crc kubenswrapper[4756]: I0318 14:31:39.767016 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" event={"ID":"570b883b-276f-43d9-983b-3f99763e7e4d","Type":"ContainerDied","Data":"3b226e1f574b597466ba429c7cec1653e3a640f6a12e7d752053b1b936e989b4"} Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.735033 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.791427 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" event={"ID":"570b883b-276f-43d9-983b-3f99763e7e4d","Type":"ContainerDied","Data":"fb6ec4fbdd66519f9b1a1ba0c545aa1868a2183a048ed3cf03c1effec2e12a82"} Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.791716 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb6ec4fbdd66519f9b1a1ba0c545aa1868a2183a048ed3cf03c1effec2e12a82" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.791774 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.824335 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/570b883b-276f-43d9-983b-3f99763e7e4d-inventory\") pod \"570b883b-276f-43d9-983b-3f99763e7e4d\" (UID: \"570b883b-276f-43d9-983b-3f99763e7e4d\") " Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.824472 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/570b883b-276f-43d9-983b-3f99763e7e4d-ssh-key-openstack-edpm-ipam\") pod \"570b883b-276f-43d9-983b-3f99763e7e4d\" (UID: \"570b883b-276f-43d9-983b-3f99763e7e4d\") " Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.824669 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlr96\" (UniqueName: \"kubernetes.io/projected/570b883b-276f-43d9-983b-3f99763e7e4d-kube-api-access-nlr96\") pod \"570b883b-276f-43d9-983b-3f99763e7e4d\" (UID: \"570b883b-276f-43d9-983b-3f99763e7e4d\") " Mar 18 14:31:41 crc 
kubenswrapper[4756]: I0318 14:31:41.836514 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/570b883b-276f-43d9-983b-3f99763e7e4d-kube-api-access-nlr96" (OuterVolumeSpecName: "kube-api-access-nlr96") pod "570b883b-276f-43d9-983b-3f99763e7e4d" (UID: "570b883b-276f-43d9-983b-3f99763e7e4d"). InnerVolumeSpecName "kube-api-access-nlr96". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.858298 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570b883b-276f-43d9-983b-3f99763e7e4d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "570b883b-276f-43d9-983b-3f99763e7e4d" (UID: "570b883b-276f-43d9-983b-3f99763e7e4d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.860533 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570b883b-276f-43d9-983b-3f99763e7e4d-inventory" (OuterVolumeSpecName: "inventory") pod "570b883b-276f-43d9-983b-3f99763e7e4d" (UID: "570b883b-276f-43d9-983b-3f99763e7e4d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.883991 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z"] Mar 18 14:31:41 crc kubenswrapper[4756]: E0318 14:31:41.884575 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="570b883b-276f-43d9-983b-3f99763e7e4d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.884599 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="570b883b-276f-43d9-983b-3f99763e7e4d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 18 14:31:41 crc kubenswrapper[4756]: E0318 14:31:41.884627 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe5b527-3c19-4691-9a4a-0344f5d43860" containerName="collect-profiles" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.884638 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe5b527-3c19-4691-9a4a-0344f5d43860" containerName="collect-profiles" Mar 18 14:31:41 crc kubenswrapper[4756]: E0318 14:31:41.884668 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437200a1-9595-448c-8f6f-9612f8ca2e2e" containerName="oc" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.884677 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="437200a1-9595-448c-8f6f-9612f8ca2e2e" containerName="oc" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.884931 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe5b527-3c19-4691-9a4a-0344f5d43860" containerName="collect-profiles" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.884965 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="570b883b-276f-43d9-983b-3f99763e7e4d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.884988 4756 
memory_manager.go:354] "RemoveStaleState removing state" podUID="437200a1-9595-448c-8f6f-9612f8ca2e2e" containerName="oc" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.885957 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.896297 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z"] Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.927571 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlr96\" (UniqueName: \"kubernetes.io/projected/570b883b-276f-43d9-983b-3f99763e7e4d-kube-api-access-nlr96\") on node \"crc\" DevicePath \"\"" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.927616 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/570b883b-276f-43d9-983b-3f99763e7e4d-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:31:41 crc kubenswrapper[4756]: I0318 14:31:41.927631 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/570b883b-276f-43d9-983b-3f99763e7e4d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:31:42 crc kubenswrapper[4756]: I0318 14:31:42.028847 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/437705be-47a9-4902-9b5d-c8f293a3985e-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z\" (UID: \"437705be-47a9-4902-9b5d-c8f293a3985e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" Mar 18 14:31:42 crc kubenswrapper[4756]: I0318 14:31:42.028944 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-qwswv\" (UniqueName: \"kubernetes.io/projected/437705be-47a9-4902-9b5d-c8f293a3985e-kube-api-access-qwswv\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z\" (UID: \"437705be-47a9-4902-9b5d-c8f293a3985e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" Mar 18 14:31:42 crc kubenswrapper[4756]: I0318 14:31:42.028975 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/437705be-47a9-4902-9b5d-c8f293a3985e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z\" (UID: \"437705be-47a9-4902-9b5d-c8f293a3985e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" Mar 18 14:31:42 crc kubenswrapper[4756]: I0318 14:31:42.131775 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/437705be-47a9-4902-9b5d-c8f293a3985e-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z\" (UID: \"437705be-47a9-4902-9b5d-c8f293a3985e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" Mar 18 14:31:42 crc kubenswrapper[4756]: I0318 14:31:42.131958 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwswv\" (UniqueName: \"kubernetes.io/projected/437705be-47a9-4902-9b5d-c8f293a3985e-kube-api-access-qwswv\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z\" (UID: \"437705be-47a9-4902-9b5d-c8f293a3985e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" Mar 18 14:31:42 crc kubenswrapper[4756]: I0318 14:31:42.132008 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/437705be-47a9-4902-9b5d-c8f293a3985e-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z\" (UID: \"437705be-47a9-4902-9b5d-c8f293a3985e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" Mar 18 14:31:42 crc kubenswrapper[4756]: I0318 14:31:42.136846 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/437705be-47a9-4902-9b5d-c8f293a3985e-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z\" (UID: \"437705be-47a9-4902-9b5d-c8f293a3985e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" Mar 18 14:31:42 crc kubenswrapper[4756]: I0318 14:31:42.137607 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/437705be-47a9-4902-9b5d-c8f293a3985e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z\" (UID: \"437705be-47a9-4902-9b5d-c8f293a3985e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" Mar 18 14:31:42 crc kubenswrapper[4756]: I0318 14:31:42.151206 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwswv\" (UniqueName: \"kubernetes.io/projected/437705be-47a9-4902-9b5d-c8f293a3985e-kube-api-access-qwswv\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z\" (UID: \"437705be-47a9-4902-9b5d-c8f293a3985e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" Mar 18 14:31:42 crc kubenswrapper[4756]: I0318 14:31:42.259043 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" Mar 18 14:31:42 crc kubenswrapper[4756]: I0318 14:31:42.836669 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:31:42 crc kubenswrapper[4756]: I0318 14:31:42.836831 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z"] Mar 18 14:31:43 crc kubenswrapper[4756]: I0318 14:31:43.816404 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" event={"ID":"437705be-47a9-4902-9b5d-c8f293a3985e","Type":"ContainerStarted","Data":"244aaa3aebd49289006153985d4e1f37ea434c676107ad4ed0395f76452f2175"} Mar 18 14:31:43 crc kubenswrapper[4756]: I0318 14:31:43.816792 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" event={"ID":"437705be-47a9-4902-9b5d-c8f293a3985e","Type":"ContainerStarted","Data":"8e88b11300122b0ace4ef4dbcf76c86787535daf8b1dda1cd55a5b90f6a3e8c7"} Mar 18 14:31:43 crc kubenswrapper[4756]: I0318 14:31:43.845477 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" podStartSLOduration=2.186860549 podStartE2EDuration="2.845452289s" podCreationTimestamp="2026-03-18 14:31:41 +0000 UTC" firstStartedPulling="2026-03-18 14:31:42.836402777 +0000 UTC m=+1904.150820772" lastFinishedPulling="2026-03-18 14:31:43.494994537 +0000 UTC m=+1904.809412512" observedRunningTime="2026-03-18 14:31:43.832511559 +0000 UTC m=+1905.146929534" watchObservedRunningTime="2026-03-18 14:31:43.845452289 +0000 UTC m=+1905.159870274" Mar 18 14:31:52 crc kubenswrapper[4756]: I0318 14:31:52.315738 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 
14:31:52 crc kubenswrapper[4756]: E0318 14:31:52.316918 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:31:59 crc kubenswrapper[4756]: I0318 14:31:59.055489 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-z7pth"] Mar 18 14:31:59 crc kubenswrapper[4756]: I0318 14:31:59.069813 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-z7pth"] Mar 18 14:31:59 crc kubenswrapper[4756]: I0318 14:31:59.334324 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb78a1f-b53a-4b17-8b91-a70da1bb9071" path="/var/lib/kubelet/pods/4bb78a1f-b53a-4b17-8b91-a70da1bb9071/volumes" Mar 18 14:32:00 crc kubenswrapper[4756]: I0318 14:32:00.191319 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564072-n29j8"] Mar 18 14:32:00 crc kubenswrapper[4756]: I0318 14:32:00.193891 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564072-n29j8" Mar 18 14:32:00 crc kubenswrapper[4756]: I0318 14:32:00.197290 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:32:00 crc kubenswrapper[4756]: I0318 14:32:00.197904 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:32:00 crc kubenswrapper[4756]: I0318 14:32:00.198955 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:32:00 crc kubenswrapper[4756]: I0318 14:32:00.213499 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564072-n29j8"] Mar 18 14:32:00 crc kubenswrapper[4756]: I0318 14:32:00.276263 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mndq5\" (UniqueName: \"kubernetes.io/projected/50619664-2536-4ae7-b393-863d44c0e69c-kube-api-access-mndq5\") pod \"auto-csr-approver-29564072-n29j8\" (UID: \"50619664-2536-4ae7-b393-863d44c0e69c\") " pod="openshift-infra/auto-csr-approver-29564072-n29j8" Mar 18 14:32:00 crc kubenswrapper[4756]: I0318 14:32:00.379942 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mndq5\" (UniqueName: \"kubernetes.io/projected/50619664-2536-4ae7-b393-863d44c0e69c-kube-api-access-mndq5\") pod \"auto-csr-approver-29564072-n29j8\" (UID: \"50619664-2536-4ae7-b393-863d44c0e69c\") " pod="openshift-infra/auto-csr-approver-29564072-n29j8" Mar 18 14:32:00 crc kubenswrapper[4756]: I0318 14:32:00.429627 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mndq5\" (UniqueName: \"kubernetes.io/projected/50619664-2536-4ae7-b393-863d44c0e69c-kube-api-access-mndq5\") pod \"auto-csr-approver-29564072-n29j8\" (UID: \"50619664-2536-4ae7-b393-863d44c0e69c\") " 
pod="openshift-infra/auto-csr-approver-29564072-n29j8" Mar 18 14:32:00 crc kubenswrapper[4756]: I0318 14:32:00.523998 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564072-n29j8" Mar 18 14:32:01 crc kubenswrapper[4756]: I0318 14:32:01.346623 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564072-n29j8"] Mar 18 14:32:01 crc kubenswrapper[4756]: W0318 14:32:01.349766 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50619664_2536_4ae7_b393_863d44c0e69c.slice/crio-c9532f311bf58d696932cf5fc076e44c164574ba26ec0bada4a90c434531f197 WatchSource:0}: Error finding container c9532f311bf58d696932cf5fc076e44c164574ba26ec0bada4a90c434531f197: Status 404 returned error can't find the container with id c9532f311bf58d696932cf5fc076e44c164574ba26ec0bada4a90c434531f197 Mar 18 14:32:02 crc kubenswrapper[4756]: I0318 14:32:02.047179 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564072-n29j8" event={"ID":"50619664-2536-4ae7-b393-863d44c0e69c","Type":"ContainerStarted","Data":"c9532f311bf58d696932cf5fc076e44c164574ba26ec0bada4a90c434531f197"} Mar 18 14:32:03 crc kubenswrapper[4756]: I0318 14:32:03.316642 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:32:03 crc kubenswrapper[4756]: E0318 14:32:03.317795 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:32:07 crc 
kubenswrapper[4756]: I0318 14:32:07.110645 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564072-n29j8" event={"ID":"50619664-2536-4ae7-b393-863d44c0e69c","Type":"ContainerStarted","Data":"ae1ea28a89ba42c44eb60c7c1996e89cf44c02233749c26618fe767a6a8591d0"} Mar 18 14:32:07 crc kubenswrapper[4756]: I0318 14:32:07.137528 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564072-n29j8" podStartSLOduration=1.85803033 podStartE2EDuration="7.137507113s" podCreationTimestamp="2026-03-18 14:32:00 +0000 UTC" firstStartedPulling="2026-03-18 14:32:01.352061024 +0000 UTC m=+1922.666479039" lastFinishedPulling="2026-03-18 14:32:06.631537837 +0000 UTC m=+1927.945955822" observedRunningTime="2026-03-18 14:32:07.131889941 +0000 UTC m=+1928.446307966" watchObservedRunningTime="2026-03-18 14:32:07.137507113 +0000 UTC m=+1928.451925098" Mar 18 14:32:08 crc kubenswrapper[4756]: I0318 14:32:08.134720 4756 generic.go:334] "Generic (PLEG): container finished" podID="50619664-2536-4ae7-b393-863d44c0e69c" containerID="ae1ea28a89ba42c44eb60c7c1996e89cf44c02233749c26618fe767a6a8591d0" exitCode=0 Mar 18 14:32:08 crc kubenswrapper[4756]: I0318 14:32:08.135537 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564072-n29j8" event={"ID":"50619664-2536-4ae7-b393-863d44c0e69c","Type":"ContainerDied","Data":"ae1ea28a89ba42c44eb60c7c1996e89cf44c02233749c26618fe767a6a8591d0"} Mar 18 14:32:09 crc kubenswrapper[4756]: I0318 14:32:09.901913 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564072-n29j8" Mar 18 14:32:10 crc kubenswrapper[4756]: I0318 14:32:10.003322 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mndq5\" (UniqueName: \"kubernetes.io/projected/50619664-2536-4ae7-b393-863d44c0e69c-kube-api-access-mndq5\") pod \"50619664-2536-4ae7-b393-863d44c0e69c\" (UID: \"50619664-2536-4ae7-b393-863d44c0e69c\") " Mar 18 14:32:10 crc kubenswrapper[4756]: I0318 14:32:10.012346 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50619664-2536-4ae7-b393-863d44c0e69c-kube-api-access-mndq5" (OuterVolumeSpecName: "kube-api-access-mndq5") pod "50619664-2536-4ae7-b393-863d44c0e69c" (UID: "50619664-2536-4ae7-b393-863d44c0e69c"). InnerVolumeSpecName "kube-api-access-mndq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:32:10 crc kubenswrapper[4756]: I0318 14:32:10.106047 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mndq5\" (UniqueName: \"kubernetes.io/projected/50619664-2536-4ae7-b393-863d44c0e69c-kube-api-access-mndq5\") on node \"crc\" DevicePath \"\"" Mar 18 14:32:10 crc kubenswrapper[4756]: I0318 14:32:10.174422 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564072-n29j8" event={"ID":"50619664-2536-4ae7-b393-863d44c0e69c","Type":"ContainerDied","Data":"c9532f311bf58d696932cf5fc076e44c164574ba26ec0bada4a90c434531f197"} Mar 18 14:32:10 crc kubenswrapper[4756]: I0318 14:32:10.174478 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9532f311bf58d696932cf5fc076e44c164574ba26ec0bada4a90c434531f197" Mar 18 14:32:10 crc kubenswrapper[4756]: I0318 14:32:10.174555 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564072-n29j8" Mar 18 14:32:10 crc kubenswrapper[4756]: I0318 14:32:10.218819 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564066-spbj4"] Mar 18 14:32:10 crc kubenswrapper[4756]: I0318 14:32:10.233387 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564066-spbj4"] Mar 18 14:32:11 crc kubenswrapper[4756]: I0318 14:32:11.329777 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e78e70b6-ea00-49f6-b5d9-8695cffcad06" path="/var/lib/kubelet/pods/e78e70b6-ea00-49f6-b5d9-8695cffcad06/volumes" Mar 18 14:32:15 crc kubenswrapper[4756]: I0318 14:32:15.054934 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-lvdn6"] Mar 18 14:32:15 crc kubenswrapper[4756]: I0318 14:32:15.068806 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-lvdn6"] Mar 18 14:32:15 crc kubenswrapper[4756]: I0318 14:32:15.091339 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-ghbmc"] Mar 18 14:32:15 crc kubenswrapper[4756]: I0318 14:32:15.094032 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fqmsw"] Mar 18 14:32:15 crc kubenswrapper[4756]: I0318 14:32:15.106256 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fqmsw"] Mar 18 14:32:15 crc kubenswrapper[4756]: I0318 14:32:15.113366 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-ghbmc"] Mar 18 14:32:15 crc kubenswrapper[4756]: I0318 14:32:15.334161 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cac161e-ec74-4621-801b-ad39634336d0" path="/var/lib/kubelet/pods/0cac161e-ec74-4621-801b-ad39634336d0/volumes" Mar 18 14:32:15 crc kubenswrapper[4756]: I0318 14:32:15.335164 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="18b4eda4-6808-4165-8b7c-e24dc046467c" path="/var/lib/kubelet/pods/18b4eda4-6808-4165-8b7c-e24dc046467c/volumes" Mar 18 14:32:15 crc kubenswrapper[4756]: I0318 14:32:15.335840 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d52c352-cfd7-4679-912b-11f753c7831f" path="/var/lib/kubelet/pods/5d52c352-cfd7-4679-912b-11f753c7831f/volumes" Mar 18 14:32:18 crc kubenswrapper[4756]: I0318 14:32:18.316277 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:32:18 crc kubenswrapper[4756]: E0318 14:32:18.316789 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:32:27 crc kubenswrapper[4756]: I0318 14:32:27.028978 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-rpx8m"] Mar 18 14:32:27 crc kubenswrapper[4756]: I0318 14:32:27.038262 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-rpx8m"] Mar 18 14:32:27 crc kubenswrapper[4756]: I0318 14:32:27.336937 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e" path="/var/lib/kubelet/pods/9e2cc691-8e4a-4fb1-ade1-ba20a9f5e56e/volumes" Mar 18 14:32:31 crc kubenswrapper[4756]: I0318 14:32:31.316264 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:32:31 crc kubenswrapper[4756]: E0318 14:32:31.317015 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:32:31 crc kubenswrapper[4756]: I0318 14:32:31.410179 4756 scope.go:117] "RemoveContainer" containerID="68fa0a2f1098d2b8cc780fc58f60074582e008246aa9d6f7ccf5b94dfc2b5a57" Mar 18 14:32:31 crc kubenswrapper[4756]: I0318 14:32:31.454511 4756 scope.go:117] "RemoveContainer" containerID="d6ebd677533e1c0881c300219bdb060483ab11e6568210358df4233c1599fac8" Mar 18 14:32:31 crc kubenswrapper[4756]: I0318 14:32:31.534944 4756 scope.go:117] "RemoveContainer" containerID="6d37e6b93023940efa98e04c99af04d9028ddaa782ed1c41577a4d170a3b9b6b" Mar 18 14:32:31 crc kubenswrapper[4756]: I0318 14:32:31.573079 4756 scope.go:117] "RemoveContainer" containerID="d2d1fe77d93c754e6515368dd66601e0dd41859e0f19118ebdbf4cde6a187c19" Mar 18 14:32:31 crc kubenswrapper[4756]: I0318 14:32:31.619647 4756 scope.go:117] "RemoveContainer" containerID="7abf039d8bbcd752c853e9aa21a26341344b0b3ac137b315a4f245db38214005" Mar 18 14:32:31 crc kubenswrapper[4756]: I0318 14:32:31.662745 4756 scope.go:117] "RemoveContainer" containerID="b02d67eec55be30003a0e76f6969294b0e486d32472e917f8959eec771490a55" Mar 18 14:32:42 crc kubenswrapper[4756]: I0318 14:32:42.316336 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:32:42 crc kubenswrapper[4756]: E0318 14:32:42.317717 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:32:50 crc kubenswrapper[4756]: I0318 14:32:50.611785 4756 generic.go:334] "Generic (PLEG): container finished" podID="437705be-47a9-4902-9b5d-c8f293a3985e" containerID="244aaa3aebd49289006153985d4e1f37ea434c676107ad4ed0395f76452f2175" exitCode=0 Mar 18 14:32:50 crc kubenswrapper[4756]: I0318 14:32:50.611899 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" event={"ID":"437705be-47a9-4902-9b5d-c8f293a3985e","Type":"ContainerDied","Data":"244aaa3aebd49289006153985d4e1f37ea434c676107ad4ed0395f76452f2175"} Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.320925 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.377636 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwswv\" (UniqueName: \"kubernetes.io/projected/437705be-47a9-4902-9b5d-c8f293a3985e-kube-api-access-qwswv\") pod \"437705be-47a9-4902-9b5d-c8f293a3985e\" (UID: \"437705be-47a9-4902-9b5d-c8f293a3985e\") " Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.377812 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/437705be-47a9-4902-9b5d-c8f293a3985e-ssh-key-openstack-edpm-ipam\") pod \"437705be-47a9-4902-9b5d-c8f293a3985e\" (UID: \"437705be-47a9-4902-9b5d-c8f293a3985e\") " Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.377891 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/437705be-47a9-4902-9b5d-c8f293a3985e-inventory\") pod \"437705be-47a9-4902-9b5d-c8f293a3985e\" (UID: 
\"437705be-47a9-4902-9b5d-c8f293a3985e\") " Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.384400 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/437705be-47a9-4902-9b5d-c8f293a3985e-kube-api-access-qwswv" (OuterVolumeSpecName: "kube-api-access-qwswv") pod "437705be-47a9-4902-9b5d-c8f293a3985e" (UID: "437705be-47a9-4902-9b5d-c8f293a3985e"). InnerVolumeSpecName "kube-api-access-qwswv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.410908 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437705be-47a9-4902-9b5d-c8f293a3985e-inventory" (OuterVolumeSpecName: "inventory") pod "437705be-47a9-4902-9b5d-c8f293a3985e" (UID: "437705be-47a9-4902-9b5d-c8f293a3985e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.415490 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437705be-47a9-4902-9b5d-c8f293a3985e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "437705be-47a9-4902-9b5d-c8f293a3985e" (UID: "437705be-47a9-4902-9b5d-c8f293a3985e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.480638 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/437705be-47a9-4902-9b5d-c8f293a3985e-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.480674 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwswv\" (UniqueName: \"kubernetes.io/projected/437705be-47a9-4902-9b5d-c8f293a3985e-kube-api-access-qwswv\") on node \"crc\" DevicePath \"\"" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.480686 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/437705be-47a9-4902-9b5d-c8f293a3985e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.650194 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" event={"ID":"437705be-47a9-4902-9b5d-c8f293a3985e","Type":"ContainerDied","Data":"8e88b11300122b0ace4ef4dbcf76c86787535daf8b1dda1cd55a5b90f6a3e8c7"} Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.650231 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e88b11300122b0ace4ef4dbcf76c86787535daf8b1dda1cd55a5b90f6a3e8c7" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.650288 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.728553 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh"] Mar 18 14:32:52 crc kubenswrapper[4756]: E0318 14:32:52.728977 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50619664-2536-4ae7-b393-863d44c0e69c" containerName="oc" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.728988 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="50619664-2536-4ae7-b393-863d44c0e69c" containerName="oc" Mar 18 14:32:52 crc kubenswrapper[4756]: E0318 14:32:52.729018 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437705be-47a9-4902-9b5d-c8f293a3985e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.729025 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="437705be-47a9-4902-9b5d-c8f293a3985e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.729259 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="50619664-2536-4ae7-b393-863d44c0e69c" containerName="oc" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.729276 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="437705be-47a9-4902-9b5d-c8f293a3985e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.730004 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.732145 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.732278 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.732374 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.732736 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.742837 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh"] Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.787514 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6twg\" (UniqueName: \"kubernetes.io/projected/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-kube-api-access-d6twg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-spvxh\" (UID: \"9422548d-30ba-46b8-a1b0-3a6dfa64bd70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.787741 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-spvxh\" (UID: \"9422548d-30ba-46b8-a1b0-3a6dfa64bd70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" Mar 18 
14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.787896 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-spvxh\" (UID: \"9422548d-30ba-46b8-a1b0-3a6dfa64bd70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.890489 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-spvxh\" (UID: \"9422548d-30ba-46b8-a1b0-3a6dfa64bd70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.890596 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-spvxh\" (UID: \"9422548d-30ba-46b8-a1b0-3a6dfa64bd70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.890847 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6twg\" (UniqueName: \"kubernetes.io/projected/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-kube-api-access-d6twg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-spvxh\" (UID: \"9422548d-30ba-46b8-a1b0-3a6dfa64bd70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.894353 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-spvxh\" (UID: \"9422548d-30ba-46b8-a1b0-3a6dfa64bd70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.894727 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-spvxh\" (UID: \"9422548d-30ba-46b8-a1b0-3a6dfa64bd70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" Mar 18 14:32:52 crc kubenswrapper[4756]: I0318 14:32:52.907216 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6twg\" (UniqueName: \"kubernetes.io/projected/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-kube-api-access-d6twg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-spvxh\" (UID: \"9422548d-30ba-46b8-a1b0-3a6dfa64bd70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" Mar 18 14:32:53 crc kubenswrapper[4756]: I0318 14:32:53.050887 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" Mar 18 14:32:53 crc kubenswrapper[4756]: I0318 14:32:53.327939 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:32:53 crc kubenswrapper[4756]: E0318 14:32:53.328812 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:32:53 crc kubenswrapper[4756]: I0318 14:32:53.573733 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh"] Mar 18 14:32:53 crc kubenswrapper[4756]: I0318 14:32:53.662407 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" event={"ID":"9422548d-30ba-46b8-a1b0-3a6dfa64bd70","Type":"ContainerStarted","Data":"271b4bf7d741d87cecb4a52407f06c2dbfbe353c8dbc0e25101b2364c99f2730"} Mar 18 14:32:54 crc kubenswrapper[4756]: I0318 14:32:54.676224 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" event={"ID":"9422548d-30ba-46b8-a1b0-3a6dfa64bd70","Type":"ContainerStarted","Data":"b0f111cd246f6ba0050e39d9134d1d795c8cb4dadd9833911603e49e3399f381"} Mar 18 14:32:54 crc kubenswrapper[4756]: I0318 14:32:54.700087 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" podStartSLOduration=2.26929875 podStartE2EDuration="2.700057991s" podCreationTimestamp="2026-03-18 14:32:52 +0000 UTC" 
firstStartedPulling="2026-03-18 14:32:53.578009927 +0000 UTC m=+1974.892427902" lastFinishedPulling="2026-03-18 14:32:54.008769148 +0000 UTC m=+1975.323187143" observedRunningTime="2026-03-18 14:32:54.691455399 +0000 UTC m=+1976.005873404" watchObservedRunningTime="2026-03-18 14:32:54.700057991 +0000 UTC m=+1976.014475976" Mar 18 14:32:59 crc kubenswrapper[4756]: I0318 14:32:59.725812 4756 generic.go:334] "Generic (PLEG): container finished" podID="9422548d-30ba-46b8-a1b0-3a6dfa64bd70" containerID="b0f111cd246f6ba0050e39d9134d1d795c8cb4dadd9833911603e49e3399f381" exitCode=0 Mar 18 14:32:59 crc kubenswrapper[4756]: I0318 14:32:59.725952 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" event={"ID":"9422548d-30ba-46b8-a1b0-3a6dfa64bd70","Type":"ContainerDied","Data":"b0f111cd246f6ba0050e39d9134d1d795c8cb4dadd9833911603e49e3399f381"} Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.408608 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.601104 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-inventory\") pod \"9422548d-30ba-46b8-a1b0-3a6dfa64bd70\" (UID: \"9422548d-30ba-46b8-a1b0-3a6dfa64bd70\") " Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.601668 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-ssh-key-openstack-edpm-ipam\") pod \"9422548d-30ba-46b8-a1b0-3a6dfa64bd70\" (UID: \"9422548d-30ba-46b8-a1b0-3a6dfa64bd70\") " Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.601750 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6twg\" (UniqueName: \"kubernetes.io/projected/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-kube-api-access-d6twg\") pod \"9422548d-30ba-46b8-a1b0-3a6dfa64bd70\" (UID: \"9422548d-30ba-46b8-a1b0-3a6dfa64bd70\") " Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.608386 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-kube-api-access-d6twg" (OuterVolumeSpecName: "kube-api-access-d6twg") pod "9422548d-30ba-46b8-a1b0-3a6dfa64bd70" (UID: "9422548d-30ba-46b8-a1b0-3a6dfa64bd70"). InnerVolumeSpecName "kube-api-access-d6twg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.674432 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9422548d-30ba-46b8-a1b0-3a6dfa64bd70" (UID: "9422548d-30ba-46b8-a1b0-3a6dfa64bd70"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.674731 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-inventory" (OuterVolumeSpecName: "inventory") pod "9422548d-30ba-46b8-a1b0-3a6dfa64bd70" (UID: "9422548d-30ba-46b8-a1b0-3a6dfa64bd70"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.706746 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6twg\" (UniqueName: \"kubernetes.io/projected/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-kube-api-access-d6twg\") on node \"crc\" DevicePath \"\"" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.706801 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.706815 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9422548d-30ba-46b8-a1b0-3a6dfa64bd70-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.791735 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" 
event={"ID":"9422548d-30ba-46b8-a1b0-3a6dfa64bd70","Type":"ContainerDied","Data":"271b4bf7d741d87cecb4a52407f06c2dbfbe353c8dbc0e25101b2364c99f2730"} Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.792029 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="271b4bf7d741d87cecb4a52407f06c2dbfbe353c8dbc0e25101b2364c99f2730" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.792173 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-spvxh" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.840469 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh"] Mar 18 14:33:01 crc kubenswrapper[4756]: E0318 14:33:01.840919 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9422548d-30ba-46b8-a1b0-3a6dfa64bd70" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.840941 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9422548d-30ba-46b8-a1b0-3a6dfa64bd70" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.841157 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9422548d-30ba-46b8-a1b0-3a6dfa64bd70" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.841894 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.844792 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.845161 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.845347 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.845498 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:33:01 crc kubenswrapper[4756]: I0318 14:33:01.851856 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh"] Mar 18 14:33:02 crc kubenswrapper[4756]: I0318 14:33:02.012313 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52e4cd39-a658-42c6-b1d3-2f7a144688f1-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znvdh\" (UID: \"52e4cd39-a658-42c6-b1d3-2f7a144688f1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" Mar 18 14:33:02 crc kubenswrapper[4756]: I0318 14:33:02.012434 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnzbc\" (UniqueName: \"kubernetes.io/projected/52e4cd39-a658-42c6-b1d3-2f7a144688f1-kube-api-access-pnzbc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znvdh\" (UID: \"52e4cd39-a658-42c6-b1d3-2f7a144688f1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" Mar 18 14:33:02 crc kubenswrapper[4756]: I0318 
14:33:02.012543 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52e4cd39-a658-42c6-b1d3-2f7a144688f1-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znvdh\" (UID: \"52e4cd39-a658-42c6-b1d3-2f7a144688f1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" Mar 18 14:33:02 crc kubenswrapper[4756]: I0318 14:33:02.114278 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52e4cd39-a658-42c6-b1d3-2f7a144688f1-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znvdh\" (UID: \"52e4cd39-a658-42c6-b1d3-2f7a144688f1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" Mar 18 14:33:02 crc kubenswrapper[4756]: I0318 14:33:02.114369 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52e4cd39-a658-42c6-b1d3-2f7a144688f1-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znvdh\" (UID: \"52e4cd39-a658-42c6-b1d3-2f7a144688f1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" Mar 18 14:33:02 crc kubenswrapper[4756]: I0318 14:33:02.114437 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnzbc\" (UniqueName: \"kubernetes.io/projected/52e4cd39-a658-42c6-b1d3-2f7a144688f1-kube-api-access-pnzbc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znvdh\" (UID: \"52e4cd39-a658-42c6-b1d3-2f7a144688f1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" Mar 18 14:33:02 crc kubenswrapper[4756]: I0318 14:33:02.118644 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/52e4cd39-a658-42c6-b1d3-2f7a144688f1-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znvdh\" (UID: \"52e4cd39-a658-42c6-b1d3-2f7a144688f1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" Mar 18 14:33:02 crc kubenswrapper[4756]: I0318 14:33:02.120933 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52e4cd39-a658-42c6-b1d3-2f7a144688f1-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znvdh\" (UID: \"52e4cd39-a658-42c6-b1d3-2f7a144688f1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" Mar 18 14:33:02 crc kubenswrapper[4756]: I0318 14:33:02.134262 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzbc\" (UniqueName: \"kubernetes.io/projected/52e4cd39-a658-42c6-b1d3-2f7a144688f1-kube-api-access-pnzbc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-znvdh\" (UID: \"52e4cd39-a658-42c6-b1d3-2f7a144688f1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" Mar 18 14:33:02 crc kubenswrapper[4756]: I0318 14:33:02.160898 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" Mar 18 14:33:02 crc kubenswrapper[4756]: I0318 14:33:02.728821 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh"] Mar 18 14:33:02 crc kubenswrapper[4756]: I0318 14:33:02.806522 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" event={"ID":"52e4cd39-a658-42c6-b1d3-2f7a144688f1","Type":"ContainerStarted","Data":"6ba6db703173d2a370854d5c30214ec22d2eecce661db513ff6de6dc80c90550"} Mar 18 14:33:03 crc kubenswrapper[4756]: I0318 14:33:03.819596 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" event={"ID":"52e4cd39-a658-42c6-b1d3-2f7a144688f1","Type":"ContainerStarted","Data":"99e8f1afb0f36f056808817dd2f918aa071ed03488f0a2ab7671fb7e5e8a8491"} Mar 18 14:33:03 crc kubenswrapper[4756]: I0318 14:33:03.849431 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" podStartSLOduration=2.453271849 podStartE2EDuration="2.849407084s" podCreationTimestamp="2026-03-18 14:33:01 +0000 UTC" firstStartedPulling="2026-03-18 14:33:02.73291106 +0000 UTC m=+1984.047329035" lastFinishedPulling="2026-03-18 14:33:03.129010944 +0000 UTC m=+1984.443464270" observedRunningTime="2026-03-18 14:33:03.836953549 +0000 UTC m=+1985.151371534" watchObservedRunningTime="2026-03-18 14:33:03.849407084 +0000 UTC m=+1985.163825069" Mar 18 14:33:05 crc kubenswrapper[4756]: I0318 14:33:05.315947 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:33:05 crc kubenswrapper[4756]: E0318 14:33:05.316720 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.068992 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9614-account-create-update-4784w"] Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.081974 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-f7zgs"] Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.093200 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-ac52-account-create-update-kxhc4"] Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.108269 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-98s5z"] Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.117896 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9614-account-create-update-4784w"] Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.123795 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-1f8d-account-create-update-47b8t"] Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.131458 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-f9qhz"] Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.139051 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-ac52-account-create-update-kxhc4"] Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.146082 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-98s5z"] Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.153007 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-f7zgs"] Mar 18 14:33:09 
crc kubenswrapper[4756]: I0318 14:33:09.160012 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-f9qhz"] Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.166710 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-1f8d-account-create-update-47b8t"] Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.331831 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1190e7a3-cde5-467e-adbc-b20c6f7823d5" path="/var/lib/kubelet/pods/1190e7a3-cde5-467e-adbc-b20c6f7823d5/volumes" Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.333415 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d1ea486-ed86-4ae5-9374-3538e9d1e4fc" path="/var/lib/kubelet/pods/3d1ea486-ed86-4ae5-9374-3538e9d1e4fc/volumes" Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.334184 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aabe4a2-969f-44f7-aa29-60db47845f80" path="/var/lib/kubelet/pods/5aabe4a2-969f-44f7-aa29-60db47845f80/volumes" Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.334974 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c62dc13-d167-48da-909e-eff8ca5852f4" path="/var/lib/kubelet/pods/8c62dc13-d167-48da-909e-eff8ca5852f4/volumes" Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.336800 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="928231e9-14ab-4f87-85c0-37a372d3ed9d" path="/var/lib/kubelet/pods/928231e9-14ab-4f87-85c0-37a372d3ed9d/volumes" Mar 18 14:33:09 crc kubenswrapper[4756]: I0318 14:33:09.338204 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc4adb0e-915e-4b69-ad6f-2f510f53e2e8" path="/var/lib/kubelet/pods/dc4adb0e-915e-4b69-ad6f-2f510f53e2e8/volumes" Mar 18 14:33:19 crc kubenswrapper[4756]: I0318 14:33:19.321540 4756 scope.go:117] "RemoveContainer" 
containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:33:19 crc kubenswrapper[4756]: E0318 14:33:19.322319 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:33:30 crc kubenswrapper[4756]: I0318 14:33:30.316922 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:33:30 crc kubenswrapper[4756]: E0318 14:33:30.318589 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:33:31 crc kubenswrapper[4756]: I0318 14:33:31.853174 4756 scope.go:117] "RemoveContainer" containerID="ff3106128bda59043bbdbf5e0778ab558813dd81e0e44b3f12da9fb2361dea8d" Mar 18 14:33:31 crc kubenswrapper[4756]: I0318 14:33:31.891647 4756 scope.go:117] "RemoveContainer" containerID="7e98aefca504c4b58fa01acbc93d1b2ff2866bf22a5a2b4b1e64c5cd365a6f5f" Mar 18 14:33:31 crc kubenswrapper[4756]: I0318 14:33:31.976723 4756 scope.go:117] "RemoveContainer" containerID="02c4177cfb2feb7265bb8b4f7d30aea3746c3361258770026ad9235036f215ac" Mar 18 14:33:32 crc kubenswrapper[4756]: I0318 14:33:32.024667 4756 scope.go:117] "RemoveContainer" containerID="1b46a550f07d89c90c84d54479874f5949ef2f0e0cb0872c60f476d80993bef2" Mar 18 14:33:32 crc 
kubenswrapper[4756]: I0318 14:33:32.069379 4756 scope.go:117] "RemoveContainer" containerID="d38503af1d576fa6cca741f50a2e53337bdf196bf6690d4b3414bc8615900fd5" Mar 18 14:33:32 crc kubenswrapper[4756]: I0318 14:33:32.117453 4756 scope.go:117] "RemoveContainer" containerID="17e85402eccc18bc2e601485967e35c113bbd4370f6b7df22124dd9024330ee9" Mar 18 14:33:38 crc kubenswrapper[4756]: I0318 14:33:38.254767 4756 generic.go:334] "Generic (PLEG): container finished" podID="52e4cd39-a658-42c6-b1d3-2f7a144688f1" containerID="99e8f1afb0f36f056808817dd2f918aa071ed03488f0a2ab7671fb7e5e8a8491" exitCode=0 Mar 18 14:33:38 crc kubenswrapper[4756]: I0318 14:33:38.255596 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" event={"ID":"52e4cd39-a658-42c6-b1d3-2f7a144688f1","Type":"ContainerDied","Data":"99e8f1afb0f36f056808817dd2f918aa071ed03488f0a2ab7671fb7e5e8a8491"} Mar 18 14:33:39 crc kubenswrapper[4756]: I0318 14:33:39.781444 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" Mar 18 14:33:39 crc kubenswrapper[4756]: I0318 14:33:39.800114 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52e4cd39-a658-42c6-b1d3-2f7a144688f1-ssh-key-openstack-edpm-ipam\") pod \"52e4cd39-a658-42c6-b1d3-2f7a144688f1\" (UID: \"52e4cd39-a658-42c6-b1d3-2f7a144688f1\") " Mar 18 14:33:39 crc kubenswrapper[4756]: I0318 14:33:39.800227 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnzbc\" (UniqueName: \"kubernetes.io/projected/52e4cd39-a658-42c6-b1d3-2f7a144688f1-kube-api-access-pnzbc\") pod \"52e4cd39-a658-42c6-b1d3-2f7a144688f1\" (UID: \"52e4cd39-a658-42c6-b1d3-2f7a144688f1\") " Mar 18 14:33:39 crc kubenswrapper[4756]: I0318 14:33:39.800510 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52e4cd39-a658-42c6-b1d3-2f7a144688f1-inventory\") pod \"52e4cd39-a658-42c6-b1d3-2f7a144688f1\" (UID: \"52e4cd39-a658-42c6-b1d3-2f7a144688f1\") " Mar 18 14:33:39 crc kubenswrapper[4756]: I0318 14:33:39.807959 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52e4cd39-a658-42c6-b1d3-2f7a144688f1-kube-api-access-pnzbc" (OuterVolumeSpecName: "kube-api-access-pnzbc") pod "52e4cd39-a658-42c6-b1d3-2f7a144688f1" (UID: "52e4cd39-a658-42c6-b1d3-2f7a144688f1"). InnerVolumeSpecName "kube-api-access-pnzbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:33:39 crc kubenswrapper[4756]: I0318 14:33:39.877269 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e4cd39-a658-42c6-b1d3-2f7a144688f1-inventory" (OuterVolumeSpecName: "inventory") pod "52e4cd39-a658-42c6-b1d3-2f7a144688f1" (UID: "52e4cd39-a658-42c6-b1d3-2f7a144688f1"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:33:39 crc kubenswrapper[4756]: I0318 14:33:39.884045 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e4cd39-a658-42c6-b1d3-2f7a144688f1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "52e4cd39-a658-42c6-b1d3-2f7a144688f1" (UID: "52e4cd39-a658-42c6-b1d3-2f7a144688f1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:33:39 crc kubenswrapper[4756]: I0318 14:33:39.904006 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/52e4cd39-a658-42c6-b1d3-2f7a144688f1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:33:39 crc kubenswrapper[4756]: I0318 14:33:39.904051 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnzbc\" (UniqueName: \"kubernetes.io/projected/52e4cd39-a658-42c6-b1d3-2f7a144688f1-kube-api-access-pnzbc\") on node \"crc\" DevicePath \"\"" Mar 18 14:33:39 crc kubenswrapper[4756]: I0318 14:33:39.904070 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52e4cd39-a658-42c6-b1d3-2f7a144688f1-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.282556 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" event={"ID":"52e4cd39-a658-42c6-b1d3-2f7a144688f1","Type":"ContainerDied","Data":"6ba6db703173d2a370854d5c30214ec22d2eecce661db513ff6de6dc80c90550"} Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.282882 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ba6db703173d2a370854d5c30214ec22d2eecce661db513ff6de6dc80c90550" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 
14:33:40.282620 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-znvdh" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.391987 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd"] Mar 18 14:33:40 crc kubenswrapper[4756]: E0318 14:33:40.392575 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e4cd39-a658-42c6-b1d3-2f7a144688f1" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.392604 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e4cd39-a658-42c6-b1d3-2f7a144688f1" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.392885 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="52e4cd39-a658-42c6-b1d3-2f7a144688f1" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.393840 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.396749 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.396749 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.396848 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.397394 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.403041 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd"] Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.514495 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dd7799ac-8443-42eb-ab9e-47e654eb8dca-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd\" (UID: \"dd7799ac-8443-42eb-ab9e-47e654eb8dca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.515080 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7799ac-8443-42eb-ab9e-47e654eb8dca-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd\" (UID: \"dd7799ac-8443-42eb-ab9e-47e654eb8dca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.515249 
4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crpkd\" (UniqueName: \"kubernetes.io/projected/dd7799ac-8443-42eb-ab9e-47e654eb8dca-kube-api-access-crpkd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd\" (UID: \"dd7799ac-8443-42eb-ab9e-47e654eb8dca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.617507 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dd7799ac-8443-42eb-ab9e-47e654eb8dca-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd\" (UID: \"dd7799ac-8443-42eb-ab9e-47e654eb8dca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.617920 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7799ac-8443-42eb-ab9e-47e654eb8dca-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd\" (UID: \"dd7799ac-8443-42eb-ab9e-47e654eb8dca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.618020 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crpkd\" (UniqueName: \"kubernetes.io/projected/dd7799ac-8443-42eb-ab9e-47e654eb8dca-kube-api-access-crpkd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd\" (UID: \"dd7799ac-8443-42eb-ab9e-47e654eb8dca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.623331 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/dd7799ac-8443-42eb-ab9e-47e654eb8dca-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd\" (UID: \"dd7799ac-8443-42eb-ab9e-47e654eb8dca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.623477 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7799ac-8443-42eb-ab9e-47e654eb8dca-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd\" (UID: \"dd7799ac-8443-42eb-ab9e-47e654eb8dca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.637113 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crpkd\" (UniqueName: \"kubernetes.io/projected/dd7799ac-8443-42eb-ab9e-47e654eb8dca-kube-api-access-crpkd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd\" (UID: \"dd7799ac-8443-42eb-ab9e-47e654eb8dca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" Mar 18 14:33:40 crc kubenswrapper[4756]: I0318 14:33:40.729247 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" Mar 18 14:33:41 crc kubenswrapper[4756]: I0318 14:33:41.050056 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k4zq4"] Mar 18 14:33:41 crc kubenswrapper[4756]: I0318 14:33:41.064166 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k4zq4"] Mar 18 14:33:41 crc kubenswrapper[4756]: I0318 14:33:41.329969 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73e64ca2-cc29-4a74-a970-b127ec0380f1" path="/var/lib/kubelet/pods/73e64ca2-cc29-4a74-a970-b127ec0380f1/volumes" Mar 18 14:33:41 crc kubenswrapper[4756]: W0318 14:33:41.364658 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd7799ac_8443_42eb_ab9e_47e654eb8dca.slice/crio-eb46d405a872a170bc0e1efa8301e80a31d785536e03a943233b337926cc3959 WatchSource:0}: Error finding container eb46d405a872a170bc0e1efa8301e80a31d785536e03a943233b337926cc3959: Status 404 returned error can't find the container with id eb46d405a872a170bc0e1efa8301e80a31d785536e03a943233b337926cc3959 Mar 18 14:33:41 crc kubenswrapper[4756]: I0318 14:33:41.366011 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd"] Mar 18 14:33:42 crc kubenswrapper[4756]: I0318 14:33:42.304915 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" event={"ID":"dd7799ac-8443-42eb-ab9e-47e654eb8dca","Type":"ContainerStarted","Data":"4803deeac4178e32dd7639108174771565206d26882ff09424ed6a9994047159"} Mar 18 14:33:42 crc kubenswrapper[4756]: I0318 14:33:42.305303 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" 
event={"ID":"dd7799ac-8443-42eb-ab9e-47e654eb8dca","Type":"ContainerStarted","Data":"eb46d405a872a170bc0e1efa8301e80a31d785536e03a943233b337926cc3959"} Mar 18 14:33:42 crc kubenswrapper[4756]: I0318 14:33:42.323474 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" podStartSLOduration=1.727587979 podStartE2EDuration="2.323452657s" podCreationTimestamp="2026-03-18 14:33:40 +0000 UTC" firstStartedPulling="2026-03-18 14:33:41.370636762 +0000 UTC m=+2022.685054737" lastFinishedPulling="2026-03-18 14:33:41.96650145 +0000 UTC m=+2023.280919415" observedRunningTime="2026-03-18 14:33:42.322235924 +0000 UTC m=+2023.636653899" watchObservedRunningTime="2026-03-18 14:33:42.323452657 +0000 UTC m=+2023.637870632" Mar 18 14:33:43 crc kubenswrapper[4756]: I0318 14:33:43.315243 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:33:43 crc kubenswrapper[4756]: E0318 14:33:43.315799 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:33:54 crc kubenswrapper[4756]: I0318 14:33:54.316060 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:33:54 crc kubenswrapper[4756]: E0318 14:33:54.317108 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:34:00 crc kubenswrapper[4756]: I0318 14:34:00.171225 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564074-vt6tr"] Mar 18 14:34:00 crc kubenswrapper[4756]: I0318 14:34:00.184449 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564074-vt6tr" Mar 18 14:34:00 crc kubenswrapper[4756]: I0318 14:34:00.201974 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:34:00 crc kubenswrapper[4756]: I0318 14:34:00.202598 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:34:00 crc kubenswrapper[4756]: I0318 14:34:00.202934 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:34:00 crc kubenswrapper[4756]: I0318 14:34:00.208880 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564074-vt6tr"] Mar 18 14:34:00 crc kubenswrapper[4756]: I0318 14:34:00.270330 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md6b4\" (UniqueName: \"kubernetes.io/projected/363e90c7-2daa-4777-bb42-3d82405bedff-kube-api-access-md6b4\") pod \"auto-csr-approver-29564074-vt6tr\" (UID: \"363e90c7-2daa-4777-bb42-3d82405bedff\") " pod="openshift-infra/auto-csr-approver-29564074-vt6tr" Mar 18 14:34:00 crc kubenswrapper[4756]: I0318 14:34:00.373883 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md6b4\" (UniqueName: \"kubernetes.io/projected/363e90c7-2daa-4777-bb42-3d82405bedff-kube-api-access-md6b4\") 
pod \"auto-csr-approver-29564074-vt6tr\" (UID: \"363e90c7-2daa-4777-bb42-3d82405bedff\") " pod="openshift-infra/auto-csr-approver-29564074-vt6tr" Mar 18 14:34:00 crc kubenswrapper[4756]: I0318 14:34:00.404057 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md6b4\" (UniqueName: \"kubernetes.io/projected/363e90c7-2daa-4777-bb42-3d82405bedff-kube-api-access-md6b4\") pod \"auto-csr-approver-29564074-vt6tr\" (UID: \"363e90c7-2daa-4777-bb42-3d82405bedff\") " pod="openshift-infra/auto-csr-approver-29564074-vt6tr" Mar 18 14:34:00 crc kubenswrapper[4756]: I0318 14:34:00.514746 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564074-vt6tr" Mar 18 14:34:01 crc kubenswrapper[4756]: I0318 14:34:01.004634 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564074-vt6tr"] Mar 18 14:34:01 crc kubenswrapper[4756]: I0318 14:34:01.507217 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564074-vt6tr" event={"ID":"363e90c7-2daa-4777-bb42-3d82405bedff","Type":"ContainerStarted","Data":"4b860feb7d7c443bb5493df64dc2d4957287542659742e6e888e8371fc2fa56d"} Mar 18 14:34:03 crc kubenswrapper[4756]: I0318 14:34:03.528512 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564074-vt6tr" event={"ID":"363e90c7-2daa-4777-bb42-3d82405bedff","Type":"ContainerStarted","Data":"95b204db1c49842f6030ea424a910543abb9fcb7b6bf51ede615bac1e42ec729"} Mar 18 14:34:03 crc kubenswrapper[4756]: I0318 14:34:03.551173 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564074-vt6tr" podStartSLOduration=1.3709314209999999 podStartE2EDuration="3.551109323s" podCreationTimestamp="2026-03-18 14:34:00 +0000 UTC" firstStartedPulling="2026-03-18 14:34:01.015836064 +0000 UTC m=+2042.330254039" 
lastFinishedPulling="2026-03-18 14:34:03.196013976 +0000 UTC m=+2044.510431941" observedRunningTime="2026-03-18 14:34:03.545871732 +0000 UTC m=+2044.860289727" watchObservedRunningTime="2026-03-18 14:34:03.551109323 +0000 UTC m=+2044.865527308" Mar 18 14:34:04 crc kubenswrapper[4756]: I0318 14:34:04.540163 4756 generic.go:334] "Generic (PLEG): container finished" podID="363e90c7-2daa-4777-bb42-3d82405bedff" containerID="95b204db1c49842f6030ea424a910543abb9fcb7b6bf51ede615bac1e42ec729" exitCode=0 Mar 18 14:34:04 crc kubenswrapper[4756]: I0318 14:34:04.540217 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564074-vt6tr" event={"ID":"363e90c7-2daa-4777-bb42-3d82405bedff","Type":"ContainerDied","Data":"95b204db1c49842f6030ea424a910543abb9fcb7b6bf51ede615bac1e42ec729"} Mar 18 14:34:05 crc kubenswrapper[4756]: I0318 14:34:05.058794 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-t85xw"] Mar 18 14:34:05 crc kubenswrapper[4756]: I0318 14:34:05.072179 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-t85xw"] Mar 18 14:34:05 crc kubenswrapper[4756]: I0318 14:34:05.330034 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61189734-2130-4de6-aa34-c96879ff64de" path="/var/lib/kubelet/pods/61189734-2130-4de6-aa34-c96879ff64de/volumes" Mar 18 14:34:06 crc kubenswrapper[4756]: I0318 14:34:06.002156 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564074-vt6tr" Mar 18 14:34:06 crc kubenswrapper[4756]: I0318 14:34:06.049772 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b64xk"] Mar 18 14:34:06 crc kubenswrapper[4756]: I0318 14:34:06.060745 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b64xk"] Mar 18 14:34:06 crc kubenswrapper[4756]: I0318 14:34:06.099860 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md6b4\" (UniqueName: \"kubernetes.io/projected/363e90c7-2daa-4777-bb42-3d82405bedff-kube-api-access-md6b4\") pod \"363e90c7-2daa-4777-bb42-3d82405bedff\" (UID: \"363e90c7-2daa-4777-bb42-3d82405bedff\") " Mar 18 14:34:06 crc kubenswrapper[4756]: I0318 14:34:06.105522 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/363e90c7-2daa-4777-bb42-3d82405bedff-kube-api-access-md6b4" (OuterVolumeSpecName: "kube-api-access-md6b4") pod "363e90c7-2daa-4777-bb42-3d82405bedff" (UID: "363e90c7-2daa-4777-bb42-3d82405bedff"). InnerVolumeSpecName "kube-api-access-md6b4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:34:06 crc kubenswrapper[4756]: I0318 14:34:06.203318 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md6b4\" (UniqueName: \"kubernetes.io/projected/363e90c7-2daa-4777-bb42-3d82405bedff-kube-api-access-md6b4\") on node \"crc\" DevicePath \"\"" Mar 18 14:34:06 crc kubenswrapper[4756]: I0318 14:34:06.561957 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564074-vt6tr" event={"ID":"363e90c7-2daa-4777-bb42-3d82405bedff","Type":"ContainerDied","Data":"4b860feb7d7c443bb5493df64dc2d4957287542659742e6e888e8371fc2fa56d"} Mar 18 14:34:06 crc kubenswrapper[4756]: I0318 14:34:06.561995 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b860feb7d7c443bb5493df64dc2d4957287542659742e6e888e8371fc2fa56d" Mar 18 14:34:06 crc kubenswrapper[4756]: I0318 14:34:06.562055 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564074-vt6tr" Mar 18 14:34:06 crc kubenswrapper[4756]: I0318 14:34:06.610088 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564068-4xkm4"] Mar 18 14:34:06 crc kubenswrapper[4756]: I0318 14:34:06.621553 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564068-4xkm4"] Mar 18 14:34:07 crc kubenswrapper[4756]: I0318 14:34:07.317329 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:34:07 crc kubenswrapper[4756]: E0318 14:34:07.317846 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:34:07 crc kubenswrapper[4756]: I0318 14:34:07.334861 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b5953e-a4c2-44bd-ae8c-74ded1ebba07" path="/var/lib/kubelet/pods/71b5953e-a4c2-44bd-ae8c-74ded1ebba07/volumes" Mar 18 14:34:07 crc kubenswrapper[4756]: I0318 14:34:07.336812 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8af5145-9647-4eb0-95f2-4e32d681fc9e" path="/var/lib/kubelet/pods/a8af5145-9647-4eb0-95f2-4e32d681fc9e/volumes" Mar 18 14:34:22 crc kubenswrapper[4756]: I0318 14:34:22.316844 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:34:22 crc kubenswrapper[4756]: E0318 14:34:22.318315 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:34:28 crc kubenswrapper[4756]: I0318 14:34:28.803909 4756 generic.go:334] "Generic (PLEG): container finished" podID="dd7799ac-8443-42eb-ab9e-47e654eb8dca" containerID="4803deeac4178e32dd7639108174771565206d26882ff09424ed6a9994047159" exitCode=0 Mar 18 14:34:28 crc kubenswrapper[4756]: I0318 14:34:28.804013 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" event={"ID":"dd7799ac-8443-42eb-ab9e-47e654eb8dca","Type":"ContainerDied","Data":"4803deeac4178e32dd7639108174771565206d26882ff09424ed6a9994047159"} Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.303028 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.369400 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dd7799ac-8443-42eb-ab9e-47e654eb8dca-ssh-key-openstack-edpm-ipam\") pod \"dd7799ac-8443-42eb-ab9e-47e654eb8dca\" (UID: \"dd7799ac-8443-42eb-ab9e-47e654eb8dca\") " Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.369616 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crpkd\" (UniqueName: \"kubernetes.io/projected/dd7799ac-8443-42eb-ab9e-47e654eb8dca-kube-api-access-crpkd\") pod \"dd7799ac-8443-42eb-ab9e-47e654eb8dca\" (UID: \"dd7799ac-8443-42eb-ab9e-47e654eb8dca\") " Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.370480 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7799ac-8443-42eb-ab9e-47e654eb8dca-inventory\") pod \"dd7799ac-8443-42eb-ab9e-47e654eb8dca\" (UID: \"dd7799ac-8443-42eb-ab9e-47e654eb8dca\") " Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.377005 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd7799ac-8443-42eb-ab9e-47e654eb8dca-kube-api-access-crpkd" (OuterVolumeSpecName: "kube-api-access-crpkd") pod "dd7799ac-8443-42eb-ab9e-47e654eb8dca" (UID: "dd7799ac-8443-42eb-ab9e-47e654eb8dca"). InnerVolumeSpecName "kube-api-access-crpkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.401213 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7799ac-8443-42eb-ab9e-47e654eb8dca-inventory" (OuterVolumeSpecName: "inventory") pod "dd7799ac-8443-42eb-ab9e-47e654eb8dca" (UID: "dd7799ac-8443-42eb-ab9e-47e654eb8dca"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.404720 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7799ac-8443-42eb-ab9e-47e654eb8dca-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dd7799ac-8443-42eb-ab9e-47e654eb8dca" (UID: "dd7799ac-8443-42eb-ab9e-47e654eb8dca"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.473197 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dd7799ac-8443-42eb-ab9e-47e654eb8dca-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.473238 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crpkd\" (UniqueName: \"kubernetes.io/projected/dd7799ac-8443-42eb-ab9e-47e654eb8dca-kube-api-access-crpkd\") on node \"crc\" DevicePath \"\"" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.473258 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd7799ac-8443-42eb-ab9e-47e654eb8dca-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.833265 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" event={"ID":"dd7799ac-8443-42eb-ab9e-47e654eb8dca","Type":"ContainerDied","Data":"eb46d405a872a170bc0e1efa8301e80a31d785536e03a943233b337926cc3959"} Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.833627 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb46d405a872a170bc0e1efa8301e80a31d785536e03a943233b337926cc3959" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 
14:34:30.833391 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.943104 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sh6cs"] Mar 18 14:34:30 crc kubenswrapper[4756]: E0318 14:34:30.943582 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="363e90c7-2daa-4777-bb42-3d82405bedff" containerName="oc" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.943605 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="363e90c7-2daa-4777-bb42-3d82405bedff" containerName="oc" Mar 18 14:34:30 crc kubenswrapper[4756]: E0318 14:34:30.943649 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7799ac-8443-42eb-ab9e-47e654eb8dca" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.943658 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7799ac-8443-42eb-ab9e-47e654eb8dca" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.943874 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd7799ac-8443-42eb-ab9e-47e654eb8dca" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.943899 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="363e90c7-2daa-4777-bb42-3d82405bedff" containerName="oc" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.944730 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.951815 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.951985 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.952087 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.952208 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.956682 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sh6cs"] Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.983875 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tmlv\" (UniqueName: \"kubernetes.io/projected/799f12f9-8c31-4518-a210-e117606e6d8e-kube-api-access-6tmlv\") pod \"ssh-known-hosts-edpm-deployment-sh6cs\" (UID: \"799f12f9-8c31-4518-a210-e117606e6d8e\") " pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.983953 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/799f12f9-8c31-4518-a210-e117606e6d8e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sh6cs\" (UID: \"799f12f9-8c31-4518-a210-e117606e6d8e\") " pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" Mar 18 14:34:30 crc kubenswrapper[4756]: I0318 14:34:30.984541 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/799f12f9-8c31-4518-a210-e117606e6d8e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sh6cs\" (UID: \"799f12f9-8c31-4518-a210-e117606e6d8e\") " pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" Mar 18 14:34:31 crc kubenswrapper[4756]: I0318 14:34:31.086639 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/799f12f9-8c31-4518-a210-e117606e6d8e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sh6cs\" (UID: \"799f12f9-8c31-4518-a210-e117606e6d8e\") " pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" Mar 18 14:34:31 crc kubenswrapper[4756]: I0318 14:34:31.086737 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tmlv\" (UniqueName: \"kubernetes.io/projected/799f12f9-8c31-4518-a210-e117606e6d8e-kube-api-access-6tmlv\") pod \"ssh-known-hosts-edpm-deployment-sh6cs\" (UID: \"799f12f9-8c31-4518-a210-e117606e6d8e\") " pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" Mar 18 14:34:31 crc kubenswrapper[4756]: I0318 14:34:31.086767 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/799f12f9-8c31-4518-a210-e117606e6d8e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sh6cs\" (UID: \"799f12f9-8c31-4518-a210-e117606e6d8e\") " pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" Mar 18 14:34:31 crc kubenswrapper[4756]: I0318 14:34:31.091402 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/799f12f9-8c31-4518-a210-e117606e6d8e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sh6cs\" (UID: \"799f12f9-8c31-4518-a210-e117606e6d8e\") " pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" Mar 18 14:34:31 crc kubenswrapper[4756]: I0318 14:34:31.092232 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/799f12f9-8c31-4518-a210-e117606e6d8e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sh6cs\" (UID: \"799f12f9-8c31-4518-a210-e117606e6d8e\") " pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" Mar 18 14:34:31 crc kubenswrapper[4756]: I0318 14:34:31.105533 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tmlv\" (UniqueName: \"kubernetes.io/projected/799f12f9-8c31-4518-a210-e117606e6d8e-kube-api-access-6tmlv\") pod \"ssh-known-hosts-edpm-deployment-sh6cs\" (UID: \"799f12f9-8c31-4518-a210-e117606e6d8e\") " pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" Mar 18 14:34:31 crc kubenswrapper[4756]: I0318 14:34:31.272802 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" Mar 18 14:34:31 crc kubenswrapper[4756]: I0318 14:34:31.810994 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sh6cs"] Mar 18 14:34:31 crc kubenswrapper[4756]: I0318 14:34:31.842933 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" event={"ID":"799f12f9-8c31-4518-a210-e117606e6d8e","Type":"ContainerStarted","Data":"efc9df5020deb6ac8100efe482f96d9424271a8c3753263282c9e6fa302686c8"} Mar 18 14:34:32 crc kubenswrapper[4756]: I0318 14:34:32.313251 4756 scope.go:117] "RemoveContainer" containerID="f1203b8dbbf22be7e38417c9cf29ba809735cba1c0199aaa108f2daa976be915" Mar 18 14:34:32 crc kubenswrapper[4756]: I0318 14:34:32.479930 4756 scope.go:117] "RemoveContainer" containerID="429e283dccfbbf95903a51f71e72765e9d92ca4abec276da08636ca53dd71fce" Mar 18 14:34:32 crc kubenswrapper[4756]: I0318 14:34:32.537635 4756 scope.go:117] "RemoveContainer" containerID="3d358fd515b011708016827117bac120bc582a21947c91d875a22ff1b818a612" Mar 
18 14:34:32 crc kubenswrapper[4756]: I0318 14:34:32.600731 4756 scope.go:117] "RemoveContainer" containerID="3d7edb2d29f4b86e999900f63630402ea7f227c196f2b8799d8d520d08f29895" Mar 18 14:34:32 crc kubenswrapper[4756]: I0318 14:34:32.856071 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" event={"ID":"799f12f9-8c31-4518-a210-e117606e6d8e","Type":"ContainerStarted","Data":"79bf195da55b9c847c517358a40703816704906460a6909e770f37c974290e07"} Mar 18 14:34:32 crc kubenswrapper[4756]: I0318 14:34:32.878289 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" podStartSLOduration=2.424452757 podStartE2EDuration="2.878263299s" podCreationTimestamp="2026-03-18 14:34:30 +0000 UTC" firstStartedPulling="2026-03-18 14:34:31.818840505 +0000 UTC m=+2073.133258470" lastFinishedPulling="2026-03-18 14:34:32.272650997 +0000 UTC m=+2073.587069012" observedRunningTime="2026-03-18 14:34:32.871799124 +0000 UTC m=+2074.186217149" watchObservedRunningTime="2026-03-18 14:34:32.878263299 +0000 UTC m=+2074.192681304" Mar 18 14:34:37 crc kubenswrapper[4756]: I0318 14:34:37.316474 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:34:37 crc kubenswrapper[4756]: I0318 14:34:37.924550 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"47fe16283e653d97b821ea41888e12ff278529bb4d88752b5670809f30122a6d"} Mar 18 14:34:38 crc kubenswrapper[4756]: I0318 14:34:38.935931 4756 generic.go:334] "Generic (PLEG): container finished" podID="799f12f9-8c31-4518-a210-e117606e6d8e" containerID="79bf195da55b9c847c517358a40703816704906460a6909e770f37c974290e07" exitCode=0 Mar 18 14:34:38 crc kubenswrapper[4756]: I0318 14:34:38.936036 4756 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" event={"ID":"799f12f9-8c31-4518-a210-e117606e6d8e","Type":"ContainerDied","Data":"79bf195da55b9c847c517358a40703816704906460a6909e770f37c974290e07"} Mar 18 14:34:40 crc kubenswrapper[4756]: I0318 14:34:40.478187 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" Mar 18 14:34:40 crc kubenswrapper[4756]: I0318 14:34:40.595901 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/799f12f9-8c31-4518-a210-e117606e6d8e-ssh-key-openstack-edpm-ipam\") pod \"799f12f9-8c31-4518-a210-e117606e6d8e\" (UID: \"799f12f9-8c31-4518-a210-e117606e6d8e\") " Mar 18 14:34:40 crc kubenswrapper[4756]: I0318 14:34:40.596080 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/799f12f9-8c31-4518-a210-e117606e6d8e-inventory-0\") pod \"799f12f9-8c31-4518-a210-e117606e6d8e\" (UID: \"799f12f9-8c31-4518-a210-e117606e6d8e\") " Mar 18 14:34:40 crc kubenswrapper[4756]: I0318 14:34:40.596342 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tmlv\" (UniqueName: \"kubernetes.io/projected/799f12f9-8c31-4518-a210-e117606e6d8e-kube-api-access-6tmlv\") pod \"799f12f9-8c31-4518-a210-e117606e6d8e\" (UID: \"799f12f9-8c31-4518-a210-e117606e6d8e\") " Mar 18 14:34:40 crc kubenswrapper[4756]: I0318 14:34:40.602677 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799f12f9-8c31-4518-a210-e117606e6d8e-kube-api-access-6tmlv" (OuterVolumeSpecName: "kube-api-access-6tmlv") pod "799f12f9-8c31-4518-a210-e117606e6d8e" (UID: "799f12f9-8c31-4518-a210-e117606e6d8e"). InnerVolumeSpecName "kube-api-access-6tmlv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:34:40 crc kubenswrapper[4756]: I0318 14:34:40.625979 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799f12f9-8c31-4518-a210-e117606e6d8e-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "799f12f9-8c31-4518-a210-e117606e6d8e" (UID: "799f12f9-8c31-4518-a210-e117606e6d8e"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:34:40 crc kubenswrapper[4756]: I0318 14:34:40.647274 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799f12f9-8c31-4518-a210-e117606e6d8e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "799f12f9-8c31-4518-a210-e117606e6d8e" (UID: "799f12f9-8c31-4518-a210-e117606e6d8e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:34:40 crc kubenswrapper[4756]: I0318 14:34:40.698422 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/799f12f9-8c31-4518-a210-e117606e6d8e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:34:40 crc kubenswrapper[4756]: I0318 14:34:40.698473 4756 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/799f12f9-8c31-4518-a210-e117606e6d8e-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:34:40 crc kubenswrapper[4756]: I0318 14:34:40.698495 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tmlv\" (UniqueName: \"kubernetes.io/projected/799f12f9-8c31-4518-a210-e117606e6d8e-kube-api-access-6tmlv\") on node \"crc\" DevicePath \"\"" Mar 18 14:34:40 crc kubenswrapper[4756]: I0318 14:34:40.962674 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" 
event={"ID":"799f12f9-8c31-4518-a210-e117606e6d8e","Type":"ContainerDied","Data":"efc9df5020deb6ac8100efe482f96d9424271a8c3753263282c9e6fa302686c8"} Mar 18 14:34:40 crc kubenswrapper[4756]: I0318 14:34:40.962999 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efc9df5020deb6ac8100efe482f96d9424271a8c3753263282c9e6fa302686c8" Mar 18 14:34:40 crc kubenswrapper[4756]: I0318 14:34:40.962763 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sh6cs" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.051749 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj"] Mar 18 14:34:41 crc kubenswrapper[4756]: E0318 14:34:41.052273 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="799f12f9-8c31-4518-a210-e117606e6d8e" containerName="ssh-known-hosts-edpm-deployment" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.052296 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="799f12f9-8c31-4518-a210-e117606e6d8e" containerName="ssh-known-hosts-edpm-deployment" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.052553 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="799f12f9-8c31-4518-a210-e117606e6d8e" containerName="ssh-known-hosts-edpm-deployment" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.053474 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.055755 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.056972 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.061079 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.061555 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.066407 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj"] Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.209256 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdhlq\" (UniqueName: \"kubernetes.io/projected/39717331-913f-4e8f-b7c1-e8f8148dcd92-kube-api-access-vdhlq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vb9nj\" (UID: \"39717331-913f-4e8f-b7c1-e8f8148dcd92\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.209385 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39717331-913f-4e8f-b7c1-e8f8148dcd92-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vb9nj\" (UID: \"39717331-913f-4e8f-b7c1-e8f8148dcd92\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.209523 4756 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39717331-913f-4e8f-b7c1-e8f8148dcd92-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vb9nj\" (UID: \"39717331-913f-4e8f-b7c1-e8f8148dcd92\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.312066 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39717331-913f-4e8f-b7c1-e8f8148dcd92-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vb9nj\" (UID: \"39717331-913f-4e8f-b7c1-e8f8148dcd92\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.312298 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdhlq\" (UniqueName: \"kubernetes.io/projected/39717331-913f-4e8f-b7c1-e8f8148dcd92-kube-api-access-vdhlq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vb9nj\" (UID: \"39717331-913f-4e8f-b7c1-e8f8148dcd92\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.312396 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39717331-913f-4e8f-b7c1-e8f8148dcd92-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vb9nj\" (UID: \"39717331-913f-4e8f-b7c1-e8f8148dcd92\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.318375 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39717331-913f-4e8f-b7c1-e8f8148dcd92-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vb9nj\" (UID: 
\"39717331-913f-4e8f-b7c1-e8f8148dcd92\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.318673 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39717331-913f-4e8f-b7c1-e8f8148dcd92-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vb9nj\" (UID: \"39717331-913f-4e8f-b7c1-e8f8148dcd92\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.351456 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdhlq\" (UniqueName: \"kubernetes.io/projected/39717331-913f-4e8f-b7c1-e8f8148dcd92-kube-api-access-vdhlq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vb9nj\" (UID: \"39717331-913f-4e8f-b7c1-e8f8148dcd92\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.379402 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" Mar 18 14:34:41 crc kubenswrapper[4756]: W0318 14:34:41.992438 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39717331_913f_4e8f_b7c1_e8f8148dcd92.slice/crio-2f3f7c01dc7e55aedbd4c0e8cb7a40a0ff6eeb7eee2d959080e9d0813c967308 WatchSource:0}: Error finding container 2f3f7c01dc7e55aedbd4c0e8cb7a40a0ff6eeb7eee2d959080e9d0813c967308: Status 404 returned error can't find the container with id 2f3f7c01dc7e55aedbd4c0e8cb7a40a0ff6eeb7eee2d959080e9d0813c967308 Mar 18 14:34:41 crc kubenswrapper[4756]: I0318 14:34:41.997682 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj"] Mar 18 14:34:43 crc kubenswrapper[4756]: I0318 14:34:43.012403 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" event={"ID":"39717331-913f-4e8f-b7c1-e8f8148dcd92","Type":"ContainerStarted","Data":"d05d3aa061ffc36d0b7577a9c0a0302edd96fd5280184b7d8db0636113c66d5d"} Mar 18 14:34:43 crc kubenswrapper[4756]: I0318 14:34:43.012895 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" event={"ID":"39717331-913f-4e8f-b7c1-e8f8148dcd92","Type":"ContainerStarted","Data":"2f3f7c01dc7e55aedbd4c0e8cb7a40a0ff6eeb7eee2d959080e9d0813c967308"} Mar 18 14:34:43 crc kubenswrapper[4756]: I0318 14:34:43.031631 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" podStartSLOduration=1.49087388 podStartE2EDuration="2.031613679s" podCreationTimestamp="2026-03-18 14:34:41 +0000 UTC" firstStartedPulling="2026-03-18 14:34:41.998782874 +0000 UTC m=+2083.313200849" lastFinishedPulling="2026-03-18 14:34:42.539522673 +0000 UTC m=+2083.853940648" observedRunningTime="2026-03-18 
14:34:43.028111635 +0000 UTC m=+2084.342529620" watchObservedRunningTime="2026-03-18 14:34:43.031613679 +0000 UTC m=+2084.346031654" Mar 18 14:34:51 crc kubenswrapper[4756]: I0318 14:34:51.102053 4756 generic.go:334] "Generic (PLEG): container finished" podID="39717331-913f-4e8f-b7c1-e8f8148dcd92" containerID="d05d3aa061ffc36d0b7577a9c0a0302edd96fd5280184b7d8db0636113c66d5d" exitCode=0 Mar 18 14:34:51 crc kubenswrapper[4756]: I0318 14:34:51.102171 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" event={"ID":"39717331-913f-4e8f-b7c1-e8f8148dcd92","Type":"ContainerDied","Data":"d05d3aa061ffc36d0b7577a9c0a0302edd96fd5280184b7d8db0636113c66d5d"} Mar 18 14:34:52 crc kubenswrapper[4756]: I0318 14:34:52.606086 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" Mar 18 14:34:52 crc kubenswrapper[4756]: I0318 14:34:52.671383 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdhlq\" (UniqueName: \"kubernetes.io/projected/39717331-913f-4e8f-b7c1-e8f8148dcd92-kube-api-access-vdhlq\") pod \"39717331-913f-4e8f-b7c1-e8f8148dcd92\" (UID: \"39717331-913f-4e8f-b7c1-e8f8148dcd92\") " Mar 18 14:34:52 crc kubenswrapper[4756]: I0318 14:34:52.671477 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39717331-913f-4e8f-b7c1-e8f8148dcd92-inventory\") pod \"39717331-913f-4e8f-b7c1-e8f8148dcd92\" (UID: \"39717331-913f-4e8f-b7c1-e8f8148dcd92\") " Mar 18 14:34:52 crc kubenswrapper[4756]: I0318 14:34:52.671567 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39717331-913f-4e8f-b7c1-e8f8148dcd92-ssh-key-openstack-edpm-ipam\") pod \"39717331-913f-4e8f-b7c1-e8f8148dcd92\" (UID: 
\"39717331-913f-4e8f-b7c1-e8f8148dcd92\") " Mar 18 14:34:52 crc kubenswrapper[4756]: I0318 14:34:52.678366 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39717331-913f-4e8f-b7c1-e8f8148dcd92-kube-api-access-vdhlq" (OuterVolumeSpecName: "kube-api-access-vdhlq") pod "39717331-913f-4e8f-b7c1-e8f8148dcd92" (UID: "39717331-913f-4e8f-b7c1-e8f8148dcd92"). InnerVolumeSpecName "kube-api-access-vdhlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:34:52 crc kubenswrapper[4756]: I0318 14:34:52.714865 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39717331-913f-4e8f-b7c1-e8f8148dcd92-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "39717331-913f-4e8f-b7c1-e8f8148dcd92" (UID: "39717331-913f-4e8f-b7c1-e8f8148dcd92"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:34:52 crc kubenswrapper[4756]: I0318 14:34:52.715591 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39717331-913f-4e8f-b7c1-e8f8148dcd92-inventory" (OuterVolumeSpecName: "inventory") pod "39717331-913f-4e8f-b7c1-e8f8148dcd92" (UID: "39717331-913f-4e8f-b7c1-e8f8148dcd92"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:34:52 crc kubenswrapper[4756]: I0318 14:34:52.774410 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdhlq\" (UniqueName: \"kubernetes.io/projected/39717331-913f-4e8f-b7c1-e8f8148dcd92-kube-api-access-vdhlq\") on node \"crc\" DevicePath \"\"" Mar 18 14:34:52 crc kubenswrapper[4756]: I0318 14:34:52.774452 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39717331-913f-4e8f-b7c1-e8f8148dcd92-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:34:52 crc kubenswrapper[4756]: I0318 14:34:52.774465 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39717331-913f-4e8f-b7c1-e8f8148dcd92-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.036008 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-fttqt"] Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.045390 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-fttqt"] Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.125657 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" event={"ID":"39717331-913f-4e8f-b7c1-e8f8148dcd92","Type":"ContainerDied","Data":"2f3f7c01dc7e55aedbd4c0e8cb7a40a0ff6eeb7eee2d959080e9d0813c967308"} Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.125714 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f3f7c01dc7e55aedbd4c0e8cb7a40a0ff6eeb7eee2d959080e9d0813c967308" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.125717 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vb9nj" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.222864 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc"] Mar 18 14:34:53 crc kubenswrapper[4756]: E0318 14:34:53.223422 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39717331-913f-4e8f-b7c1-e8f8148dcd92" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.223444 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="39717331-913f-4e8f-b7c1-e8f8148dcd92" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.223691 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="39717331-913f-4e8f-b7c1-e8f8148dcd92" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.224640 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.226967 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.227514 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.228859 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.229210 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.244646 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc"] Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.326656 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2217a20d-e435-4926-a713-89fc852aab36" path="/var/lib/kubelet/pods/2217a20d-e435-4926-a713-89fc852aab36/volumes" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.384794 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b9d661cc-2616-43d2-9db7-7111e119a569-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc\" (UID: \"b9d661cc-2616-43d2-9db7-7111e119a569\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.384861 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9d661cc-2616-43d2-9db7-7111e119a569-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc\" (UID: \"b9d661cc-2616-43d2-9db7-7111e119a569\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.385589 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpb87\" (UniqueName: \"kubernetes.io/projected/b9d661cc-2616-43d2-9db7-7111e119a569-kube-api-access-rpb87\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc\" (UID: \"b9d661cc-2616-43d2-9db7-7111e119a569\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.487360 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpb87\" (UniqueName: \"kubernetes.io/projected/b9d661cc-2616-43d2-9db7-7111e119a569-kube-api-access-rpb87\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc\" (UID: \"b9d661cc-2616-43d2-9db7-7111e119a569\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.487584 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b9d661cc-2616-43d2-9db7-7111e119a569-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc\" (UID: \"b9d661cc-2616-43d2-9db7-7111e119a569\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.487700 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9d661cc-2616-43d2-9db7-7111e119a569-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc\" (UID: \"b9d661cc-2616-43d2-9db7-7111e119a569\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" Mar 18 14:34:53 crc 
kubenswrapper[4756]: I0318 14:34:53.495228 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9d661cc-2616-43d2-9db7-7111e119a569-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc\" (UID: \"b9d661cc-2616-43d2-9db7-7111e119a569\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.495349 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b9d661cc-2616-43d2-9db7-7111e119a569-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc\" (UID: \"b9d661cc-2616-43d2-9db7-7111e119a569\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.520893 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpb87\" (UniqueName: \"kubernetes.io/projected/b9d661cc-2616-43d2-9db7-7111e119a569-kube-api-access-rpb87\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc\" (UID: \"b9d661cc-2616-43d2-9db7-7111e119a569\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" Mar 18 14:34:53 crc kubenswrapper[4756]: I0318 14:34:53.545479 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" Mar 18 14:34:54 crc kubenswrapper[4756]: I0318 14:34:54.122855 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc"] Mar 18 14:34:54 crc kubenswrapper[4756]: W0318 14:34:54.128536 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9d661cc_2616_43d2_9db7_7111e119a569.slice/crio-059307a2100d9027f9f34eca2e325a2f7b37e5bc609c52b283a8a7ba415db0d8 WatchSource:0}: Error finding container 059307a2100d9027f9f34eca2e325a2f7b37e5bc609c52b283a8a7ba415db0d8: Status 404 returned error can't find the container with id 059307a2100d9027f9f34eca2e325a2f7b37e5bc609c52b283a8a7ba415db0d8 Mar 18 14:34:55 crc kubenswrapper[4756]: I0318 14:34:55.148099 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" event={"ID":"b9d661cc-2616-43d2-9db7-7111e119a569","Type":"ContainerStarted","Data":"095eb23a5d1ac13a31014d3ed785fce214359d8b33985406f8f8d68e5e9394da"} Mar 18 14:34:55 crc kubenswrapper[4756]: I0318 14:34:55.148812 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" event={"ID":"b9d661cc-2616-43d2-9db7-7111e119a569","Type":"ContainerStarted","Data":"059307a2100d9027f9f34eca2e325a2f7b37e5bc609c52b283a8a7ba415db0d8"} Mar 18 14:34:55 crc kubenswrapper[4756]: I0318 14:34:55.169317 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" podStartSLOduration=1.728913785 podStartE2EDuration="2.169293296s" podCreationTimestamp="2026-03-18 14:34:53 +0000 UTC" firstStartedPulling="2026-03-18 14:34:54.131874926 +0000 UTC m=+2095.446292911" lastFinishedPulling="2026-03-18 14:34:54.572254437 +0000 UTC m=+2095.886672422" 
observedRunningTime="2026-03-18 14:34:55.165960906 +0000 UTC m=+2096.480378891" watchObservedRunningTime="2026-03-18 14:34:55.169293296 +0000 UTC m=+2096.483711301" Mar 18 14:34:58 crc kubenswrapper[4756]: I0318 14:34:58.464265 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s9gz8"] Mar 18 14:34:58 crc kubenswrapper[4756]: I0318 14:34:58.470033 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:34:58 crc kubenswrapper[4756]: I0318 14:34:58.479331 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s9gz8"] Mar 18 14:34:58 crc kubenswrapper[4756]: I0318 14:34:58.511248 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/211f866c-5d09-415c-ae4a-b579240dea79-utilities\") pod \"redhat-operators-s9gz8\" (UID: \"211f866c-5d09-415c-ae4a-b579240dea79\") " pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:34:58 crc kubenswrapper[4756]: I0318 14:34:58.511300 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgl8f\" (UniqueName: \"kubernetes.io/projected/211f866c-5d09-415c-ae4a-b579240dea79-kube-api-access-hgl8f\") pod \"redhat-operators-s9gz8\" (UID: \"211f866c-5d09-415c-ae4a-b579240dea79\") " pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:34:58 crc kubenswrapper[4756]: I0318 14:34:58.511368 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/211f866c-5d09-415c-ae4a-b579240dea79-catalog-content\") pod \"redhat-operators-s9gz8\" (UID: \"211f866c-5d09-415c-ae4a-b579240dea79\") " pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:34:58 crc kubenswrapper[4756]: I0318 14:34:58.613173 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/211f866c-5d09-415c-ae4a-b579240dea79-utilities\") pod \"redhat-operators-s9gz8\" (UID: \"211f866c-5d09-415c-ae4a-b579240dea79\") " pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:34:58 crc kubenswrapper[4756]: I0318 14:34:58.613529 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgl8f\" (UniqueName: \"kubernetes.io/projected/211f866c-5d09-415c-ae4a-b579240dea79-kube-api-access-hgl8f\") pod \"redhat-operators-s9gz8\" (UID: \"211f866c-5d09-415c-ae4a-b579240dea79\") " pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:34:58 crc kubenswrapper[4756]: I0318 14:34:58.613718 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/211f866c-5d09-415c-ae4a-b579240dea79-catalog-content\") pod \"redhat-operators-s9gz8\" (UID: \"211f866c-5d09-415c-ae4a-b579240dea79\") " pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:34:58 crc kubenswrapper[4756]: I0318 14:34:58.613722 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/211f866c-5d09-415c-ae4a-b579240dea79-utilities\") pod \"redhat-operators-s9gz8\" (UID: \"211f866c-5d09-415c-ae4a-b579240dea79\") " pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:34:58 crc kubenswrapper[4756]: I0318 14:34:58.614226 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/211f866c-5d09-415c-ae4a-b579240dea79-catalog-content\") pod \"redhat-operators-s9gz8\" (UID: \"211f866c-5d09-415c-ae4a-b579240dea79\") " pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:34:58 crc kubenswrapper[4756]: I0318 14:34:58.636989 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hgl8f\" (UniqueName: \"kubernetes.io/projected/211f866c-5d09-415c-ae4a-b579240dea79-kube-api-access-hgl8f\") pod \"redhat-operators-s9gz8\" (UID: \"211f866c-5d09-415c-ae4a-b579240dea79\") " pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:34:58 crc kubenswrapper[4756]: I0318 14:34:58.799906 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:34:59 crc kubenswrapper[4756]: I0318 14:34:59.331353 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s9gz8"] Mar 18 14:35:00 crc kubenswrapper[4756]: I0318 14:35:00.207514 4756 generic.go:334] "Generic (PLEG): container finished" podID="211f866c-5d09-415c-ae4a-b579240dea79" containerID="653aa355a25867f2914324c2f547514e791ee9afe11c45d13468d74ca20d7e91" exitCode=0 Mar 18 14:35:00 crc kubenswrapper[4756]: I0318 14:35:00.207612 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9gz8" event={"ID":"211f866c-5d09-415c-ae4a-b579240dea79","Type":"ContainerDied","Data":"653aa355a25867f2914324c2f547514e791ee9afe11c45d13468d74ca20d7e91"} Mar 18 14:35:00 crc kubenswrapper[4756]: I0318 14:35:00.207994 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9gz8" event={"ID":"211f866c-5d09-415c-ae4a-b579240dea79","Type":"ContainerStarted","Data":"618b8b38dcb2c2fbaea39e95b58f39690981f945bcb6b2b90646354d95a4ae16"} Mar 18 14:35:02 crc kubenswrapper[4756]: I0318 14:35:02.233916 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9gz8" event={"ID":"211f866c-5d09-415c-ae4a-b579240dea79","Type":"ContainerStarted","Data":"556f3e42d33391f2d646a60cb00222706f325c2efd54aae92a4b60312cdbf32f"} Mar 18 14:35:04 crc kubenswrapper[4756]: I0318 14:35:04.256935 4756 generic.go:334] "Generic (PLEG): container finished" 
podID="b9d661cc-2616-43d2-9db7-7111e119a569" containerID="095eb23a5d1ac13a31014d3ed785fce214359d8b33985406f8f8d68e5e9394da" exitCode=0 Mar 18 14:35:04 crc kubenswrapper[4756]: I0318 14:35:04.256981 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" event={"ID":"b9d661cc-2616-43d2-9db7-7111e119a569","Type":"ContainerDied","Data":"095eb23a5d1ac13a31014d3ed785fce214359d8b33985406f8f8d68e5e9394da"} Mar 18 14:35:05 crc kubenswrapper[4756]: I0318 14:35:05.274928 4756 generic.go:334] "Generic (PLEG): container finished" podID="211f866c-5d09-415c-ae4a-b579240dea79" containerID="556f3e42d33391f2d646a60cb00222706f325c2efd54aae92a4b60312cdbf32f" exitCode=0 Mar 18 14:35:05 crc kubenswrapper[4756]: I0318 14:35:05.274986 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9gz8" event={"ID":"211f866c-5d09-415c-ae4a-b579240dea79","Type":"ContainerDied","Data":"556f3e42d33391f2d646a60cb00222706f325c2efd54aae92a4b60312cdbf32f"} Mar 18 14:35:05 crc kubenswrapper[4756]: I0318 14:35:05.859754 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" Mar 18 14:35:05 crc kubenswrapper[4756]: I0318 14:35:05.988515 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b9d661cc-2616-43d2-9db7-7111e119a569-ssh-key-openstack-edpm-ipam\") pod \"b9d661cc-2616-43d2-9db7-7111e119a569\" (UID: \"b9d661cc-2616-43d2-9db7-7111e119a569\") " Mar 18 14:35:05 crc kubenswrapper[4756]: I0318 14:35:05.988698 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9d661cc-2616-43d2-9db7-7111e119a569-inventory\") pod \"b9d661cc-2616-43d2-9db7-7111e119a569\" (UID: \"b9d661cc-2616-43d2-9db7-7111e119a569\") " Mar 18 14:35:05 crc kubenswrapper[4756]: I0318 14:35:05.988825 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpb87\" (UniqueName: \"kubernetes.io/projected/b9d661cc-2616-43d2-9db7-7111e119a569-kube-api-access-rpb87\") pod \"b9d661cc-2616-43d2-9db7-7111e119a569\" (UID: \"b9d661cc-2616-43d2-9db7-7111e119a569\") " Mar 18 14:35:05 crc kubenswrapper[4756]: I0318 14:35:05.994014 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9d661cc-2616-43d2-9db7-7111e119a569-kube-api-access-rpb87" (OuterVolumeSpecName: "kube-api-access-rpb87") pod "b9d661cc-2616-43d2-9db7-7111e119a569" (UID: "b9d661cc-2616-43d2-9db7-7111e119a569"). InnerVolumeSpecName "kube-api-access-rpb87". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.018240 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9d661cc-2616-43d2-9db7-7111e119a569-inventory" (OuterVolumeSpecName: "inventory") pod "b9d661cc-2616-43d2-9db7-7111e119a569" (UID: "b9d661cc-2616-43d2-9db7-7111e119a569"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.019812 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9d661cc-2616-43d2-9db7-7111e119a569-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b9d661cc-2616-43d2-9db7-7111e119a569" (UID: "b9d661cc-2616-43d2-9db7-7111e119a569"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.091984 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b9d661cc-2616-43d2-9db7-7111e119a569-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.092020 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9d661cc-2616-43d2-9db7-7111e119a569-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.092065 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpb87\" (UniqueName: \"kubernetes.io/projected/b9d661cc-2616-43d2-9db7-7111e119a569-kube-api-access-rpb87\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.284879 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" event={"ID":"b9d661cc-2616-43d2-9db7-7111e119a569","Type":"ContainerDied","Data":"059307a2100d9027f9f34eca2e325a2f7b37e5bc609c52b283a8a7ba415db0d8"} Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.284923 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="059307a2100d9027f9f34eca2e325a2f7b37e5bc609c52b283a8a7ba415db0d8" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 
14:35:06.284971 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.291457 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9gz8" event={"ID":"211f866c-5d09-415c-ae4a-b579240dea79","Type":"ContainerStarted","Data":"630b6dd77409ba7373ac305ac878dd1cc21f7e30cb20c62c26a34c0ac8ea6657"} Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.326199 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s9gz8" podStartSLOduration=2.765390616 podStartE2EDuration="8.326180141s" podCreationTimestamp="2026-03-18 14:34:58 +0000 UTC" firstStartedPulling="2026-03-18 14:35:00.209906278 +0000 UTC m=+2101.524324263" lastFinishedPulling="2026-03-18 14:35:05.770695803 +0000 UTC m=+2107.085113788" observedRunningTime="2026-03-18 14:35:06.310086377 +0000 UTC m=+2107.624504362" watchObservedRunningTime="2026-03-18 14:35:06.326180141 +0000 UTC m=+2107.640598116" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.367725 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm"] Mar 18 14:35:06 crc kubenswrapper[4756]: E0318 14:35:06.368219 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9d661cc-2616-43d2-9db7-7111e119a569" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.368242 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9d661cc-2616-43d2-9db7-7111e119a569" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.368478 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9d661cc-2616-43d2-9db7-7111e119a569" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" 
Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.369466 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.371750 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.372038 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.372091 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.372040 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.372179 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.372482 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.372762 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.375682 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.391187 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm"] Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.503426 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.503486 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.503517 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.503538 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm24t\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-kube-api-access-mm24t\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.503628 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.503679 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.503713 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.503741 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.503797 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.503824 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.503870 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.503908 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.504075 4756 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.504327 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.606918 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.606976 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.607003 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.607025 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.607056 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.607080 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.607135 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.607176 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.607213 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.607259 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.607412 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.607449 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.607485 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.607515 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm24t\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-kube-api-access-mm24t\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.613396 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.613996 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.614014 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.614279 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.614658 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.614698 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.614807 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.614916 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.614860 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc 
kubenswrapper[4756]: I0318 14:35:06.615188 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.615330 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.615482 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.617498 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.624522 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm24t\" 
(UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-kube-api-access-mm24t\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:06 crc kubenswrapper[4756]: I0318 14:35:06.692661 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:07 crc kubenswrapper[4756]: W0318 14:35:07.326551 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeda86959_5a56_443e_b21e_9d0dcd73e6b6.slice/crio-22fbeddabe0473c48e421e3e16af98dea7822a73768ba74cca00e9b6766f1341 WatchSource:0}: Error finding container 22fbeddabe0473c48e421e3e16af98dea7822a73768ba74cca00e9b6766f1341: Status 404 returned error can't find the container with id 22fbeddabe0473c48e421e3e16af98dea7822a73768ba74cca00e9b6766f1341 Mar 18 14:35:07 crc kubenswrapper[4756]: I0318 14:35:07.330357 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm"] Mar 18 14:35:08 crc kubenswrapper[4756]: I0318 14:35:08.316235 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" event={"ID":"eda86959-5a56-443e-b21e-9d0dcd73e6b6","Type":"ContainerStarted","Data":"b3f99537810f2b9419119d42256407f4a984f073ef8b83ff4e6177108210a94a"} Mar 18 14:35:08 crc kubenswrapper[4756]: I0318 14:35:08.316632 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" event={"ID":"eda86959-5a56-443e-b21e-9d0dcd73e6b6","Type":"ContainerStarted","Data":"22fbeddabe0473c48e421e3e16af98dea7822a73768ba74cca00e9b6766f1341"} Mar 18 14:35:08 crc kubenswrapper[4756]: I0318 14:35:08.338744 4756 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" podStartSLOduration=1.862996533 podStartE2EDuration="2.338723338s" podCreationTimestamp="2026-03-18 14:35:06 +0000 UTC" firstStartedPulling="2026-03-18 14:35:07.328903913 +0000 UTC m=+2108.643321888" lastFinishedPulling="2026-03-18 14:35:07.804630708 +0000 UTC m=+2109.119048693" observedRunningTime="2026-03-18 14:35:08.33843939 +0000 UTC m=+2109.652857375" watchObservedRunningTime="2026-03-18 14:35:08.338723338 +0000 UTC m=+2109.653141313" Mar 18 14:35:08 crc kubenswrapper[4756]: I0318 14:35:08.800325 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:35:08 crc kubenswrapper[4756]: I0318 14:35:08.800775 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:35:09 crc kubenswrapper[4756]: I0318 14:35:09.897387 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s9gz8" podUID="211f866c-5d09-415c-ae4a-b579240dea79" containerName="registry-server" probeResult="failure" output=< Mar 18 14:35:09 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Mar 18 14:35:09 crc kubenswrapper[4756]: > Mar 18 14:35:18 crc kubenswrapper[4756]: I0318 14:35:18.923837 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:35:19 crc kubenswrapper[4756]: I0318 14:35:19.016695 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:35:19 crc kubenswrapper[4756]: I0318 14:35:19.170595 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s9gz8"] Mar 18 14:35:20 crc kubenswrapper[4756]: I0318 14:35:20.472178 4756 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s9gz8" podUID="211f866c-5d09-415c-ae4a-b579240dea79" containerName="registry-server" containerID="cri-o://630b6dd77409ba7373ac305ac878dd1cc21f7e30cb20c62c26a34c0ac8ea6657" gracePeriod=2 Mar 18 14:35:20 crc kubenswrapper[4756]: I0318 14:35:20.969053 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.137201 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgl8f\" (UniqueName: \"kubernetes.io/projected/211f866c-5d09-415c-ae4a-b579240dea79-kube-api-access-hgl8f\") pod \"211f866c-5d09-415c-ae4a-b579240dea79\" (UID: \"211f866c-5d09-415c-ae4a-b579240dea79\") " Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.137586 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/211f866c-5d09-415c-ae4a-b579240dea79-utilities\") pod \"211f866c-5d09-415c-ae4a-b579240dea79\" (UID: \"211f866c-5d09-415c-ae4a-b579240dea79\") " Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.137647 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/211f866c-5d09-415c-ae4a-b579240dea79-catalog-content\") pod \"211f866c-5d09-415c-ae4a-b579240dea79\" (UID: \"211f866c-5d09-415c-ae4a-b579240dea79\") " Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.138693 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/211f866c-5d09-415c-ae4a-b579240dea79-utilities" (OuterVolumeSpecName: "utilities") pod "211f866c-5d09-415c-ae4a-b579240dea79" (UID: "211f866c-5d09-415c-ae4a-b579240dea79"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.147724 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/211f866c-5d09-415c-ae4a-b579240dea79-kube-api-access-hgl8f" (OuterVolumeSpecName: "kube-api-access-hgl8f") pod "211f866c-5d09-415c-ae4a-b579240dea79" (UID: "211f866c-5d09-415c-ae4a-b579240dea79"). InnerVolumeSpecName "kube-api-access-hgl8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.240809 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/211f866c-5d09-415c-ae4a-b579240dea79-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.240858 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgl8f\" (UniqueName: \"kubernetes.io/projected/211f866c-5d09-415c-ae4a-b579240dea79-kube-api-access-hgl8f\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.305138 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/211f866c-5d09-415c-ae4a-b579240dea79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "211f866c-5d09-415c-ae4a-b579240dea79" (UID: "211f866c-5d09-415c-ae4a-b579240dea79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.343469 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/211f866c-5d09-415c-ae4a-b579240dea79-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.490533 4756 generic.go:334] "Generic (PLEG): container finished" podID="211f866c-5d09-415c-ae4a-b579240dea79" containerID="630b6dd77409ba7373ac305ac878dd1cc21f7e30cb20c62c26a34c0ac8ea6657" exitCode=0 Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.490581 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9gz8" event={"ID":"211f866c-5d09-415c-ae4a-b579240dea79","Type":"ContainerDied","Data":"630b6dd77409ba7373ac305ac878dd1cc21f7e30cb20c62c26a34c0ac8ea6657"} Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.490612 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9gz8" event={"ID":"211f866c-5d09-415c-ae4a-b579240dea79","Type":"ContainerDied","Data":"618b8b38dcb2c2fbaea39e95b58f39690981f945bcb6b2b90646354d95a4ae16"} Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.490632 4756 scope.go:117] "RemoveContainer" containerID="630b6dd77409ba7373ac305ac878dd1cc21f7e30cb20c62c26a34c0ac8ea6657" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.490809 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s9gz8" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.515005 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s9gz8"] Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.522365 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s9gz8"] Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.549853 4756 scope.go:117] "RemoveContainer" containerID="556f3e42d33391f2d646a60cb00222706f325c2efd54aae92a4b60312cdbf32f" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.587552 4756 scope.go:117] "RemoveContainer" containerID="653aa355a25867f2914324c2f547514e791ee9afe11c45d13468d74ca20d7e91" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.621187 4756 scope.go:117] "RemoveContainer" containerID="630b6dd77409ba7373ac305ac878dd1cc21f7e30cb20c62c26a34c0ac8ea6657" Mar 18 14:35:21 crc kubenswrapper[4756]: E0318 14:35:21.621849 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"630b6dd77409ba7373ac305ac878dd1cc21f7e30cb20c62c26a34c0ac8ea6657\": container with ID starting with 630b6dd77409ba7373ac305ac878dd1cc21f7e30cb20c62c26a34c0ac8ea6657 not found: ID does not exist" containerID="630b6dd77409ba7373ac305ac878dd1cc21f7e30cb20c62c26a34c0ac8ea6657" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.621902 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"630b6dd77409ba7373ac305ac878dd1cc21f7e30cb20c62c26a34c0ac8ea6657"} err="failed to get container status \"630b6dd77409ba7373ac305ac878dd1cc21f7e30cb20c62c26a34c0ac8ea6657\": rpc error: code = NotFound desc = could not find container \"630b6dd77409ba7373ac305ac878dd1cc21f7e30cb20c62c26a34c0ac8ea6657\": container with ID starting with 630b6dd77409ba7373ac305ac878dd1cc21f7e30cb20c62c26a34c0ac8ea6657 not found: ID does 
not exist" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.621944 4756 scope.go:117] "RemoveContainer" containerID="556f3e42d33391f2d646a60cb00222706f325c2efd54aae92a4b60312cdbf32f" Mar 18 14:35:21 crc kubenswrapper[4756]: E0318 14:35:21.622433 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"556f3e42d33391f2d646a60cb00222706f325c2efd54aae92a4b60312cdbf32f\": container with ID starting with 556f3e42d33391f2d646a60cb00222706f325c2efd54aae92a4b60312cdbf32f not found: ID does not exist" containerID="556f3e42d33391f2d646a60cb00222706f325c2efd54aae92a4b60312cdbf32f" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.622475 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"556f3e42d33391f2d646a60cb00222706f325c2efd54aae92a4b60312cdbf32f"} err="failed to get container status \"556f3e42d33391f2d646a60cb00222706f325c2efd54aae92a4b60312cdbf32f\": rpc error: code = NotFound desc = could not find container \"556f3e42d33391f2d646a60cb00222706f325c2efd54aae92a4b60312cdbf32f\": container with ID starting with 556f3e42d33391f2d646a60cb00222706f325c2efd54aae92a4b60312cdbf32f not found: ID does not exist" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.622504 4756 scope.go:117] "RemoveContainer" containerID="653aa355a25867f2914324c2f547514e791ee9afe11c45d13468d74ca20d7e91" Mar 18 14:35:21 crc kubenswrapper[4756]: E0318 14:35:21.623943 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"653aa355a25867f2914324c2f547514e791ee9afe11c45d13468d74ca20d7e91\": container with ID starting with 653aa355a25867f2914324c2f547514e791ee9afe11c45d13468d74ca20d7e91 not found: ID does not exist" containerID="653aa355a25867f2914324c2f547514e791ee9afe11c45d13468d74ca20d7e91" Mar 18 14:35:21 crc kubenswrapper[4756]: I0318 14:35:21.624037 4756 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"653aa355a25867f2914324c2f547514e791ee9afe11c45d13468d74ca20d7e91"} err="failed to get container status \"653aa355a25867f2914324c2f547514e791ee9afe11c45d13468d74ca20d7e91\": rpc error: code = NotFound desc = could not find container \"653aa355a25867f2914324c2f547514e791ee9afe11c45d13468d74ca20d7e91\": container with ID starting with 653aa355a25867f2914324c2f547514e791ee9afe11c45d13468d74ca20d7e91 not found: ID does not exist" Mar 18 14:35:23 crc kubenswrapper[4756]: I0318 14:35:23.339687 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="211f866c-5d09-415c-ae4a-b579240dea79" path="/var/lib/kubelet/pods/211f866c-5d09-415c-ae4a-b579240dea79/volumes" Mar 18 14:35:32 crc kubenswrapper[4756]: I0318 14:35:32.734315 4756 scope.go:117] "RemoveContainer" containerID="c4f0ba2ad926a259b9adfdf94ebd5d19b559540676b8757e2e18ee3567425782" Mar 18 14:35:44 crc kubenswrapper[4756]: I0318 14:35:44.775944 4756 generic.go:334] "Generic (PLEG): container finished" podID="eda86959-5a56-443e-b21e-9d0dcd73e6b6" containerID="b3f99537810f2b9419119d42256407f4a984f073ef8b83ff4e6177108210a94a" exitCode=0 Mar 18 14:35:44 crc kubenswrapper[4756]: I0318 14:35:44.776310 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" event={"ID":"eda86959-5a56-443e-b21e-9d0dcd73e6b6","Type":"ContainerDied","Data":"b3f99537810f2b9419119d42256407f4a984f073ef8b83ff4e6177108210a94a"} Mar 18 14:35:44 crc kubenswrapper[4756]: I0318 14:35:44.976354 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f58gh"] Mar 18 14:35:44 crc kubenswrapper[4756]: E0318 14:35:44.977161 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211f866c-5d09-415c-ae4a-b579240dea79" containerName="extract-content" Mar 18 14:35:44 crc kubenswrapper[4756]: I0318 14:35:44.977189 4756 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="211f866c-5d09-415c-ae4a-b579240dea79" containerName="extract-content" Mar 18 14:35:44 crc kubenswrapper[4756]: E0318 14:35:44.977246 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211f866c-5d09-415c-ae4a-b579240dea79" containerName="registry-server" Mar 18 14:35:44 crc kubenswrapper[4756]: I0318 14:35:44.977259 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="211f866c-5d09-415c-ae4a-b579240dea79" containerName="registry-server" Mar 18 14:35:44 crc kubenswrapper[4756]: E0318 14:35:44.977289 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211f866c-5d09-415c-ae4a-b579240dea79" containerName="extract-utilities" Mar 18 14:35:44 crc kubenswrapper[4756]: I0318 14:35:44.977303 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="211f866c-5d09-415c-ae4a-b579240dea79" containerName="extract-utilities" Mar 18 14:35:44 crc kubenswrapper[4756]: I0318 14:35:44.977668 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="211f866c-5d09-415c-ae4a-b579240dea79" containerName="registry-server" Mar 18 14:35:44 crc kubenswrapper[4756]: I0318 14:35:44.980445 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:44 crc kubenswrapper[4756]: I0318 14:35:44.995580 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f58gh"] Mar 18 14:35:45 crc kubenswrapper[4756]: I0318 14:35:45.088310 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e12e272-418f-4e1a-bbee-fe54eee626aa-catalog-content\") pod \"community-operators-f58gh\" (UID: \"6e12e272-418f-4e1a-bbee-fe54eee626aa\") " pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:45 crc kubenswrapper[4756]: I0318 14:35:45.088376 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f547n\" (UniqueName: \"kubernetes.io/projected/6e12e272-418f-4e1a-bbee-fe54eee626aa-kube-api-access-f547n\") pod \"community-operators-f58gh\" (UID: \"6e12e272-418f-4e1a-bbee-fe54eee626aa\") " pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:45 crc kubenswrapper[4756]: I0318 14:35:45.088406 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e12e272-418f-4e1a-bbee-fe54eee626aa-utilities\") pod \"community-operators-f58gh\" (UID: \"6e12e272-418f-4e1a-bbee-fe54eee626aa\") " pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:45 crc kubenswrapper[4756]: I0318 14:35:45.192142 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e12e272-418f-4e1a-bbee-fe54eee626aa-utilities\") pod \"community-operators-f58gh\" (UID: \"6e12e272-418f-4e1a-bbee-fe54eee626aa\") " pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:45 crc kubenswrapper[4756]: I0318 14:35:45.192476 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e12e272-418f-4e1a-bbee-fe54eee626aa-catalog-content\") pod \"community-operators-f58gh\" (UID: \"6e12e272-418f-4e1a-bbee-fe54eee626aa\") " pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:45 crc kubenswrapper[4756]: I0318 14:35:45.192547 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f547n\" (UniqueName: \"kubernetes.io/projected/6e12e272-418f-4e1a-bbee-fe54eee626aa-kube-api-access-f547n\") pod \"community-operators-f58gh\" (UID: \"6e12e272-418f-4e1a-bbee-fe54eee626aa\") " pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:45 crc kubenswrapper[4756]: I0318 14:35:45.192792 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e12e272-418f-4e1a-bbee-fe54eee626aa-utilities\") pod \"community-operators-f58gh\" (UID: \"6e12e272-418f-4e1a-bbee-fe54eee626aa\") " pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:45 crc kubenswrapper[4756]: I0318 14:35:45.195537 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e12e272-418f-4e1a-bbee-fe54eee626aa-catalog-content\") pod \"community-operators-f58gh\" (UID: \"6e12e272-418f-4e1a-bbee-fe54eee626aa\") " pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:45 crc kubenswrapper[4756]: I0318 14:35:45.221615 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f547n\" (UniqueName: \"kubernetes.io/projected/6e12e272-418f-4e1a-bbee-fe54eee626aa-kube-api-access-f547n\") pod \"community-operators-f58gh\" (UID: \"6e12e272-418f-4e1a-bbee-fe54eee626aa\") " pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:45 crc kubenswrapper[4756]: I0318 14:35:45.317978 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:45 crc kubenswrapper[4756]: I0318 14:35:45.765530 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f58gh"] Mar 18 14:35:45 crc kubenswrapper[4756]: I0318 14:35:45.791240 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f58gh" event={"ID":"6e12e272-418f-4e1a-bbee-fe54eee626aa","Type":"ContainerStarted","Data":"da269ad3d1411dd1dc14a476df1bf5bd6ae174fbc6cefa8d86760f282165e749"} Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.139540 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.323789 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-neutron-metadata-combined-ca-bundle\") pod \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.323955 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.323982 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-libvirt-combined-ca-bundle\") pod \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " Mar 18 
14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.324043 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.324080 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-bootstrap-combined-ca-bundle\") pod \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.324105 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-ssh-key-openstack-edpm-ipam\") pod \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.324159 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-telemetry-combined-ca-bundle\") pod \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.324182 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " Mar 18 14:35:46 crc kubenswrapper[4756]: 
I0318 14:35:46.324218 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-nova-combined-ca-bundle\") pod \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.324264 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm24t\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-kube-api-access-mm24t\") pod \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.324289 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-repo-setup-combined-ca-bundle\") pod \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.324341 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.324625 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-ovn-combined-ca-bundle\") pod \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.325759 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-inventory\") pod \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\" (UID: \"eda86959-5a56-443e-b21e-9d0dcd73e6b6\") " Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.330921 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "eda86959-5a56-443e-b21e-9d0dcd73e6b6" (UID: "eda86959-5a56-443e-b21e-9d0dcd73e6b6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.331009 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "eda86959-5a56-443e-b21e-9d0dcd73e6b6" (UID: "eda86959-5a56-443e-b21e-9d0dcd73e6b6"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.331050 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-kube-api-access-mm24t" (OuterVolumeSpecName: "kube-api-access-mm24t") pod "eda86959-5a56-443e-b21e-9d0dcd73e6b6" (UID: "eda86959-5a56-443e-b21e-9d0dcd73e6b6"). InnerVolumeSpecName "kube-api-access-mm24t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.333366 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "eda86959-5a56-443e-b21e-9d0dcd73e6b6" (UID: "eda86959-5a56-443e-b21e-9d0dcd73e6b6"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.333726 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "eda86959-5a56-443e-b21e-9d0dcd73e6b6" (UID: "eda86959-5a56-443e-b21e-9d0dcd73e6b6"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.334034 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "eda86959-5a56-443e-b21e-9d0dcd73e6b6" (UID: "eda86959-5a56-443e-b21e-9d0dcd73e6b6"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.336223 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "eda86959-5a56-443e-b21e-9d0dcd73e6b6" (UID: "eda86959-5a56-443e-b21e-9d0dcd73e6b6"). 
InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.336270 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "eda86959-5a56-443e-b21e-9d0dcd73e6b6" (UID: "eda86959-5a56-443e-b21e-9d0dcd73e6b6"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.336319 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "eda86959-5a56-443e-b21e-9d0dcd73e6b6" (UID: "eda86959-5a56-443e-b21e-9d0dcd73e6b6"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.336711 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "eda86959-5a56-443e-b21e-9d0dcd73e6b6" (UID: "eda86959-5a56-443e-b21e-9d0dcd73e6b6"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.336879 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "eda86959-5a56-443e-b21e-9d0dcd73e6b6" (UID: "eda86959-5a56-443e-b21e-9d0dcd73e6b6"). 
InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.338287 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "eda86959-5a56-443e-b21e-9d0dcd73e6b6" (UID: "eda86959-5a56-443e-b21e-9d0dcd73e6b6"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.358815 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-inventory" (OuterVolumeSpecName: "inventory") pod "eda86959-5a56-443e-b21e-9d0dcd73e6b6" (UID: "eda86959-5a56-443e-b21e-9d0dcd73e6b6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.366759 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eda86959-5a56-443e-b21e-9d0dcd73e6b6" (UID: "eda86959-5a56-443e-b21e-9d0dcd73e6b6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.428604 4756 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.428967 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.428984 4756 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.428999 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.429013 4756 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.429026 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm24t\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-kube-api-access-mm24t\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.429037 4756 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.429051 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.429065 4756 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.429080 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.429091 4756 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.429104 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.429142 4756 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda86959-5a56-443e-b21e-9d0dcd73e6b6-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.429161 4756 
reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eda86959-5a56-443e-b21e-9d0dcd73e6b6-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.804021 4756 generic.go:334] "Generic (PLEG): container finished" podID="6e12e272-418f-4e1a-bbee-fe54eee626aa" containerID="b1fb361e99efad246e580b6d61f145d2620fc77a139b19539bf378c38e21690a" exitCode=0 Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.804085 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f58gh" event={"ID":"6e12e272-418f-4e1a-bbee-fe54eee626aa","Type":"ContainerDied","Data":"b1fb361e99efad246e580b6d61f145d2620fc77a139b19539bf378c38e21690a"} Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.806516 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" event={"ID":"eda86959-5a56-443e-b21e-9d0dcd73e6b6","Type":"ContainerDied","Data":"22fbeddabe0473c48e421e3e16af98dea7822a73768ba74cca00e9b6766f1341"} Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.806555 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22fbeddabe0473c48e421e3e16af98dea7822a73768ba74cca00e9b6766f1341" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.806579 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.942072 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn"] Mar 18 14:35:46 crc kubenswrapper[4756]: E0318 14:35:46.942924 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda86959-5a56-443e-b21e-9d0dcd73e6b6" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.943092 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda86959-5a56-443e-b21e-9d0dcd73e6b6" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.943634 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="eda86959-5a56-443e-b21e-9d0dcd73e6b6" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.944777 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.947294 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.947538 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.948630 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.949066 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.949269 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 18 14:35:46 crc kubenswrapper[4756]: I0318 14:35:46.955947 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn"] Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.151640 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnzv8\" (UniqueName: \"kubernetes.io/projected/b3d1261b-4146-4fc4-baa5-79ac98704bcd-kube-api-access-xnzv8\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b68mn\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.151752 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b68mn\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.151990 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b68mn\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.152279 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b68mn\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.152368 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b68mn\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.254251 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnzv8\" (UniqueName: \"kubernetes.io/projected/b3d1261b-4146-4fc4-baa5-79ac98704bcd-kube-api-access-xnzv8\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b68mn\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.254313 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b68mn\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.254382 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b68mn\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.254414 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b68mn\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.254454 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b68mn\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.255691 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b68mn\" (UID: 
\"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.262456 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b68mn\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.263206 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b68mn\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.264683 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b68mn\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.273984 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnzv8\" (UniqueName: \"kubernetes.io/projected/b3d1261b-4146-4fc4-baa5-79ac98704bcd-kube-api-access-xnzv8\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b68mn\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:47 crc kubenswrapper[4756]: I0318 14:35:47.566653 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:35:48 crc kubenswrapper[4756]: I0318 14:35:48.092339 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn"] Mar 18 14:35:48 crc kubenswrapper[4756]: W0318 14:35:48.093224 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3d1261b_4146_4fc4_baa5_79ac98704bcd.slice/crio-1a48015da914df1330fdc79dc13bf15497a90ffebb4603fa151a88d0ba4c3f40 WatchSource:0}: Error finding container 1a48015da914df1330fdc79dc13bf15497a90ffebb4603fa151a88d0ba4c3f40: Status 404 returned error can't find the container with id 1a48015da914df1330fdc79dc13bf15497a90ffebb4603fa151a88d0ba4c3f40 Mar 18 14:35:48 crc kubenswrapper[4756]: I0318 14:35:48.832307 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" event={"ID":"b3d1261b-4146-4fc4-baa5-79ac98704bcd","Type":"ContainerStarted","Data":"1a48015da914df1330fdc79dc13bf15497a90ffebb4603fa151a88d0ba4c3f40"} Mar 18 14:35:48 crc kubenswrapper[4756]: I0318 14:35:48.834170 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f58gh" event={"ID":"6e12e272-418f-4e1a-bbee-fe54eee626aa","Type":"ContainerStarted","Data":"2118bd93d979fa1a27c4eb5fb34f9a73bf84bdf8731f4a5771c63e515196efbf"} Mar 18 14:35:49 crc kubenswrapper[4756]: I0318 14:35:49.843892 4756 generic.go:334] "Generic (PLEG): container finished" podID="6e12e272-418f-4e1a-bbee-fe54eee626aa" containerID="2118bd93d979fa1a27c4eb5fb34f9a73bf84bdf8731f4a5771c63e515196efbf" exitCode=0 Mar 18 14:35:49 crc kubenswrapper[4756]: I0318 14:35:49.844139 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f58gh" 
event={"ID":"6e12e272-418f-4e1a-bbee-fe54eee626aa","Type":"ContainerDied","Data":"2118bd93d979fa1a27c4eb5fb34f9a73bf84bdf8731f4a5771c63e515196efbf"} Mar 18 14:35:49 crc kubenswrapper[4756]: I0318 14:35:49.846604 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" event={"ID":"b3d1261b-4146-4fc4-baa5-79ac98704bcd","Type":"ContainerStarted","Data":"99c9cca5e96f9b330fa1bcfdba3d2cd655eed8546214c61354ac7136ab84659b"} Mar 18 14:35:49 crc kubenswrapper[4756]: I0318 14:35:49.887912 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" podStartSLOduration=3.401599236 podStartE2EDuration="3.887896404s" podCreationTimestamp="2026-03-18 14:35:46 +0000 UTC" firstStartedPulling="2026-03-18 14:35:48.095861502 +0000 UTC m=+2149.410279477" lastFinishedPulling="2026-03-18 14:35:48.58215866 +0000 UTC m=+2149.896576645" observedRunningTime="2026-03-18 14:35:49.8788438 +0000 UTC m=+2151.193261775" watchObservedRunningTime="2026-03-18 14:35:49.887896404 +0000 UTC m=+2151.202314379" Mar 18 14:35:50 crc kubenswrapper[4756]: I0318 14:35:50.859868 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f58gh" event={"ID":"6e12e272-418f-4e1a-bbee-fe54eee626aa","Type":"ContainerStarted","Data":"2ad42f103df9de558256d64663426926d670d85812c73863d57a5bed5776c2a3"} Mar 18 14:35:50 crc kubenswrapper[4756]: I0318 14:35:50.882159 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f58gh" podStartSLOduration=3.429647935 podStartE2EDuration="6.882112617s" podCreationTimestamp="2026-03-18 14:35:44 +0000 UTC" firstStartedPulling="2026-03-18 14:35:46.80720475 +0000 UTC m=+2148.121622735" lastFinishedPulling="2026-03-18 14:35:50.259669442 +0000 UTC m=+2151.574087417" observedRunningTime="2026-03-18 14:35:50.876324941 +0000 UTC m=+2152.190743076" 
watchObservedRunningTime="2026-03-18 14:35:50.882112617 +0000 UTC m=+2152.196530612" Mar 18 14:35:55 crc kubenswrapper[4756]: I0318 14:35:55.330912 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:55 crc kubenswrapper[4756]: I0318 14:35:55.331300 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:55 crc kubenswrapper[4756]: I0318 14:35:55.390987 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:55 crc kubenswrapper[4756]: I0318 14:35:55.994430 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:56 crc kubenswrapper[4756]: I0318 14:35:56.072659 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f58gh"] Mar 18 14:35:57 crc kubenswrapper[4756]: I0318 14:35:57.941925 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f58gh" podUID="6e12e272-418f-4e1a-bbee-fe54eee626aa" containerName="registry-server" containerID="cri-o://2ad42f103df9de558256d64663426926d670d85812c73863d57a5bed5776c2a3" gracePeriod=2 Mar 18 14:35:58 crc kubenswrapper[4756]: I0318 14:35:58.491440 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:58 crc kubenswrapper[4756]: I0318 14:35:58.625373 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e12e272-418f-4e1a-bbee-fe54eee626aa-catalog-content\") pod \"6e12e272-418f-4e1a-bbee-fe54eee626aa\" (UID: \"6e12e272-418f-4e1a-bbee-fe54eee626aa\") " Mar 18 14:35:58 crc kubenswrapper[4756]: I0318 14:35:58.625480 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e12e272-418f-4e1a-bbee-fe54eee626aa-utilities\") pod \"6e12e272-418f-4e1a-bbee-fe54eee626aa\" (UID: \"6e12e272-418f-4e1a-bbee-fe54eee626aa\") " Mar 18 14:35:58 crc kubenswrapper[4756]: I0318 14:35:58.625675 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f547n\" (UniqueName: \"kubernetes.io/projected/6e12e272-418f-4e1a-bbee-fe54eee626aa-kube-api-access-f547n\") pod \"6e12e272-418f-4e1a-bbee-fe54eee626aa\" (UID: \"6e12e272-418f-4e1a-bbee-fe54eee626aa\") " Mar 18 14:35:58 crc kubenswrapper[4756]: I0318 14:35:58.626538 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e12e272-418f-4e1a-bbee-fe54eee626aa-utilities" (OuterVolumeSpecName: "utilities") pod "6e12e272-418f-4e1a-bbee-fe54eee626aa" (UID: "6e12e272-418f-4e1a-bbee-fe54eee626aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:35:58 crc kubenswrapper[4756]: I0318 14:35:58.634361 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e12e272-418f-4e1a-bbee-fe54eee626aa-kube-api-access-f547n" (OuterVolumeSpecName: "kube-api-access-f547n") pod "6e12e272-418f-4e1a-bbee-fe54eee626aa" (UID: "6e12e272-418f-4e1a-bbee-fe54eee626aa"). InnerVolumeSpecName "kube-api-access-f547n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:35:58 crc kubenswrapper[4756]: I0318 14:35:58.729766 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e12e272-418f-4e1a-bbee-fe54eee626aa-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:58 crc kubenswrapper[4756]: I0318 14:35:58.729829 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f547n\" (UniqueName: \"kubernetes.io/projected/6e12e272-418f-4e1a-bbee-fe54eee626aa-kube-api-access-f547n\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:58 crc kubenswrapper[4756]: I0318 14:35:58.957876 4756 generic.go:334] "Generic (PLEG): container finished" podID="6e12e272-418f-4e1a-bbee-fe54eee626aa" containerID="2ad42f103df9de558256d64663426926d670d85812c73863d57a5bed5776c2a3" exitCode=0 Mar 18 14:35:58 crc kubenswrapper[4756]: I0318 14:35:58.957950 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f58gh" event={"ID":"6e12e272-418f-4e1a-bbee-fe54eee626aa","Type":"ContainerDied","Data":"2ad42f103df9de558256d64663426926d670d85812c73863d57a5bed5776c2a3"} Mar 18 14:35:58 crc kubenswrapper[4756]: I0318 14:35:58.957995 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f58gh" event={"ID":"6e12e272-418f-4e1a-bbee-fe54eee626aa","Type":"ContainerDied","Data":"da269ad3d1411dd1dc14a476df1bf5bd6ae174fbc6cefa8d86760f282165e749"} Mar 18 14:35:58 crc kubenswrapper[4756]: I0318 14:35:58.958048 4756 scope.go:117] "RemoveContainer" containerID="2ad42f103df9de558256d64663426926d670d85812c73863d57a5bed5776c2a3" Mar 18 14:35:58 crc kubenswrapper[4756]: I0318 14:35:58.958556 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f58gh" Mar 18 14:35:58 crc kubenswrapper[4756]: I0318 14:35:58.985921 4756 scope.go:117] "RemoveContainer" containerID="2118bd93d979fa1a27c4eb5fb34f9a73bf84bdf8731f4a5771c63e515196efbf" Mar 18 14:35:59 crc kubenswrapper[4756]: I0318 14:35:59.007626 4756 scope.go:117] "RemoveContainer" containerID="b1fb361e99efad246e580b6d61f145d2620fc77a139b19539bf378c38e21690a" Mar 18 14:35:59 crc kubenswrapper[4756]: I0318 14:35:59.020324 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e12e272-418f-4e1a-bbee-fe54eee626aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e12e272-418f-4e1a-bbee-fe54eee626aa" (UID: "6e12e272-418f-4e1a-bbee-fe54eee626aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:35:59 crc kubenswrapper[4756]: I0318 14:35:59.037031 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e12e272-418f-4e1a-bbee-fe54eee626aa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:59 crc kubenswrapper[4756]: I0318 14:35:59.062969 4756 scope.go:117] "RemoveContainer" containerID="2ad42f103df9de558256d64663426926d670d85812c73863d57a5bed5776c2a3" Mar 18 14:35:59 crc kubenswrapper[4756]: E0318 14:35:59.063651 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ad42f103df9de558256d64663426926d670d85812c73863d57a5bed5776c2a3\": container with ID starting with 2ad42f103df9de558256d64663426926d670d85812c73863d57a5bed5776c2a3 not found: ID does not exist" containerID="2ad42f103df9de558256d64663426926d670d85812c73863d57a5bed5776c2a3" Mar 18 14:35:59 crc kubenswrapper[4756]: I0318 14:35:59.063860 4756 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2ad42f103df9de558256d64663426926d670d85812c73863d57a5bed5776c2a3"} err="failed to get container status \"2ad42f103df9de558256d64663426926d670d85812c73863d57a5bed5776c2a3\": rpc error: code = NotFound desc = could not find container \"2ad42f103df9de558256d64663426926d670d85812c73863d57a5bed5776c2a3\": container with ID starting with 2ad42f103df9de558256d64663426926d670d85812c73863d57a5bed5776c2a3 not found: ID does not exist" Mar 18 14:35:59 crc kubenswrapper[4756]: I0318 14:35:59.064002 4756 scope.go:117] "RemoveContainer" containerID="2118bd93d979fa1a27c4eb5fb34f9a73bf84bdf8731f4a5771c63e515196efbf" Mar 18 14:35:59 crc kubenswrapper[4756]: E0318 14:35:59.064563 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2118bd93d979fa1a27c4eb5fb34f9a73bf84bdf8731f4a5771c63e515196efbf\": container with ID starting with 2118bd93d979fa1a27c4eb5fb34f9a73bf84bdf8731f4a5771c63e515196efbf not found: ID does not exist" containerID="2118bd93d979fa1a27c4eb5fb34f9a73bf84bdf8731f4a5771c63e515196efbf" Mar 18 14:35:59 crc kubenswrapper[4756]: I0318 14:35:59.064703 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2118bd93d979fa1a27c4eb5fb34f9a73bf84bdf8731f4a5771c63e515196efbf"} err="failed to get container status \"2118bd93d979fa1a27c4eb5fb34f9a73bf84bdf8731f4a5771c63e515196efbf\": rpc error: code = NotFound desc = could not find container \"2118bd93d979fa1a27c4eb5fb34f9a73bf84bdf8731f4a5771c63e515196efbf\": container with ID starting with 2118bd93d979fa1a27c4eb5fb34f9a73bf84bdf8731f4a5771c63e515196efbf not found: ID does not exist" Mar 18 14:35:59 crc kubenswrapper[4756]: I0318 14:35:59.064820 4756 scope.go:117] "RemoveContainer" containerID="b1fb361e99efad246e580b6d61f145d2620fc77a139b19539bf378c38e21690a" Mar 18 14:35:59 crc kubenswrapper[4756]: E0318 14:35:59.065197 4756 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b1fb361e99efad246e580b6d61f145d2620fc77a139b19539bf378c38e21690a\": container with ID starting with b1fb361e99efad246e580b6d61f145d2620fc77a139b19539bf378c38e21690a not found: ID does not exist" containerID="b1fb361e99efad246e580b6d61f145d2620fc77a139b19539bf378c38e21690a" Mar 18 14:35:59 crc kubenswrapper[4756]: I0318 14:35:59.065331 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1fb361e99efad246e580b6d61f145d2620fc77a139b19539bf378c38e21690a"} err="failed to get container status \"b1fb361e99efad246e580b6d61f145d2620fc77a139b19539bf378c38e21690a\": rpc error: code = NotFound desc = could not find container \"b1fb361e99efad246e580b6d61f145d2620fc77a139b19539bf378c38e21690a\": container with ID starting with b1fb361e99efad246e580b6d61f145d2620fc77a139b19539bf378c38e21690a not found: ID does not exist" Mar 18 14:35:59 crc kubenswrapper[4756]: I0318 14:35:59.310567 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f58gh"] Mar 18 14:35:59 crc kubenswrapper[4756]: I0318 14:35:59.329573 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f58gh"] Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.143575 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564076-nnvk9"] Mar 18 14:36:00 crc kubenswrapper[4756]: E0318 14:36:00.144413 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e12e272-418f-4e1a-bbee-fe54eee626aa" containerName="extract-utilities" Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.144430 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e12e272-418f-4e1a-bbee-fe54eee626aa" containerName="extract-utilities" Mar 18 14:36:00 crc kubenswrapper[4756]: E0318 14:36:00.144449 4756 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6e12e272-418f-4e1a-bbee-fe54eee626aa" containerName="registry-server" Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.144459 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e12e272-418f-4e1a-bbee-fe54eee626aa" containerName="registry-server" Mar 18 14:36:00 crc kubenswrapper[4756]: E0318 14:36:00.144481 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e12e272-418f-4e1a-bbee-fe54eee626aa" containerName="extract-content" Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.144489 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e12e272-418f-4e1a-bbee-fe54eee626aa" containerName="extract-content" Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.144748 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e12e272-418f-4e1a-bbee-fe54eee626aa" containerName="registry-server" Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.145790 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564076-nnvk9" Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.148714 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.153736 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.154030 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.156351 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564076-nnvk9"] Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.263739 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4v8b\" (UniqueName: 
\"kubernetes.io/projected/5f579b3d-c832-4988-895b-5545dc5c857d-kube-api-access-p4v8b\") pod \"auto-csr-approver-29564076-nnvk9\" (UID: \"5f579b3d-c832-4988-895b-5545dc5c857d\") " pod="openshift-infra/auto-csr-approver-29564076-nnvk9" Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.366515 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4v8b\" (UniqueName: \"kubernetes.io/projected/5f579b3d-c832-4988-895b-5545dc5c857d-kube-api-access-p4v8b\") pod \"auto-csr-approver-29564076-nnvk9\" (UID: \"5f579b3d-c832-4988-895b-5545dc5c857d\") " pod="openshift-infra/auto-csr-approver-29564076-nnvk9" Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.393955 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4v8b\" (UniqueName: \"kubernetes.io/projected/5f579b3d-c832-4988-895b-5545dc5c857d-kube-api-access-p4v8b\") pod \"auto-csr-approver-29564076-nnvk9\" (UID: \"5f579b3d-c832-4988-895b-5545dc5c857d\") " pod="openshift-infra/auto-csr-approver-29564076-nnvk9" Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.470763 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564076-nnvk9" Mar 18 14:36:00 crc kubenswrapper[4756]: W0318 14:36:00.942951 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f579b3d_c832_4988_895b_5545dc5c857d.slice/crio-3275ce845f439861ec48e9ec68c53e445bbe508412b623845ee97ef750aaa4bb WatchSource:0}: Error finding container 3275ce845f439861ec48e9ec68c53e445bbe508412b623845ee97ef750aaa4bb: Status 404 returned error can't find the container with id 3275ce845f439861ec48e9ec68c53e445bbe508412b623845ee97ef750aaa4bb Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.945000 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564076-nnvk9"] Mar 18 14:36:00 crc kubenswrapper[4756]: I0318 14:36:00.982794 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564076-nnvk9" event={"ID":"5f579b3d-c832-4988-895b-5545dc5c857d","Type":"ContainerStarted","Data":"3275ce845f439861ec48e9ec68c53e445bbe508412b623845ee97ef750aaa4bb"} Mar 18 14:36:01 crc kubenswrapper[4756]: I0318 14:36:01.332353 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e12e272-418f-4e1a-bbee-fe54eee626aa" path="/var/lib/kubelet/pods/6e12e272-418f-4e1a-bbee-fe54eee626aa/volumes" Mar 18 14:36:03 crc kubenswrapper[4756]: I0318 14:36:03.012353 4756 generic.go:334] "Generic (PLEG): container finished" podID="5f579b3d-c832-4988-895b-5545dc5c857d" containerID="610edfeeee407244e009b8349bd088df4b0075a56274fa3bec6087e147e8e882" exitCode=0 Mar 18 14:36:03 crc kubenswrapper[4756]: I0318 14:36:03.012431 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564076-nnvk9" event={"ID":"5f579b3d-c832-4988-895b-5545dc5c857d","Type":"ContainerDied","Data":"610edfeeee407244e009b8349bd088df4b0075a56274fa3bec6087e147e8e882"} Mar 18 14:36:04 crc kubenswrapper[4756]: I0318 
14:36:04.415259 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564076-nnvk9" Mar 18 14:36:04 crc kubenswrapper[4756]: I0318 14:36:04.557654 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4v8b\" (UniqueName: \"kubernetes.io/projected/5f579b3d-c832-4988-895b-5545dc5c857d-kube-api-access-p4v8b\") pod \"5f579b3d-c832-4988-895b-5545dc5c857d\" (UID: \"5f579b3d-c832-4988-895b-5545dc5c857d\") " Mar 18 14:36:04 crc kubenswrapper[4756]: I0318 14:36:04.562994 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f579b3d-c832-4988-895b-5545dc5c857d-kube-api-access-p4v8b" (OuterVolumeSpecName: "kube-api-access-p4v8b") pod "5f579b3d-c832-4988-895b-5545dc5c857d" (UID: "5f579b3d-c832-4988-895b-5545dc5c857d"). InnerVolumeSpecName "kube-api-access-p4v8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:36:04 crc kubenswrapper[4756]: I0318 14:36:04.660159 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4v8b\" (UniqueName: \"kubernetes.io/projected/5f579b3d-c832-4988-895b-5545dc5c857d-kube-api-access-p4v8b\") on node \"crc\" DevicePath \"\"" Mar 18 14:36:05 crc kubenswrapper[4756]: I0318 14:36:05.036349 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564076-nnvk9" event={"ID":"5f579b3d-c832-4988-895b-5545dc5c857d","Type":"ContainerDied","Data":"3275ce845f439861ec48e9ec68c53e445bbe508412b623845ee97ef750aaa4bb"} Mar 18 14:36:05 crc kubenswrapper[4756]: I0318 14:36:05.036406 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3275ce845f439861ec48e9ec68c53e445bbe508412b623845ee97ef750aaa4bb" Mar 18 14:36:05 crc kubenswrapper[4756]: I0318 14:36:05.036411 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564076-nnvk9" Mar 18 14:36:05 crc kubenswrapper[4756]: I0318 14:36:05.499456 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564070-6fr7g"] Mar 18 14:36:05 crc kubenswrapper[4756]: I0318 14:36:05.511192 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564070-6fr7g"] Mar 18 14:36:07 crc kubenswrapper[4756]: I0318 14:36:07.338651 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="437200a1-9595-448c-8f6f-9612f8ca2e2e" path="/var/lib/kubelet/pods/437200a1-9595-448c-8f6f-9612f8ca2e2e/volumes" Mar 18 14:36:11 crc kubenswrapper[4756]: I0318 14:36:11.962287 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mvrzv"] Mar 18 14:36:11 crc kubenswrapper[4756]: E0318 14:36:11.963333 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f579b3d-c832-4988-895b-5545dc5c857d" containerName="oc" Mar 18 14:36:11 crc kubenswrapper[4756]: I0318 14:36:11.963346 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f579b3d-c832-4988-895b-5545dc5c857d" containerName="oc" Mar 18 14:36:11 crc kubenswrapper[4756]: I0318 14:36:11.963550 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f579b3d-c832-4988-895b-5545dc5c857d" containerName="oc" Mar 18 14:36:11 crc kubenswrapper[4756]: I0318 14:36:11.965140 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:11 crc kubenswrapper[4756]: I0318 14:36:11.980323 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mvrzv"] Mar 18 14:36:12 crc kubenswrapper[4756]: I0318 14:36:12.008030 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbp5b\" (UniqueName: \"kubernetes.io/projected/ce9d333c-02e3-4961-8d6d-ac10e68424f6-kube-api-access-vbp5b\") pod \"certified-operators-mvrzv\" (UID: \"ce9d333c-02e3-4961-8d6d-ac10e68424f6\") " pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:12 crc kubenswrapper[4756]: I0318 14:36:12.008090 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce9d333c-02e3-4961-8d6d-ac10e68424f6-catalog-content\") pod \"certified-operators-mvrzv\" (UID: \"ce9d333c-02e3-4961-8d6d-ac10e68424f6\") " pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:12 crc kubenswrapper[4756]: I0318 14:36:12.008444 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce9d333c-02e3-4961-8d6d-ac10e68424f6-utilities\") pod \"certified-operators-mvrzv\" (UID: \"ce9d333c-02e3-4961-8d6d-ac10e68424f6\") " pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:12 crc kubenswrapper[4756]: I0318 14:36:12.109605 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbp5b\" (UniqueName: \"kubernetes.io/projected/ce9d333c-02e3-4961-8d6d-ac10e68424f6-kube-api-access-vbp5b\") pod \"certified-operators-mvrzv\" (UID: \"ce9d333c-02e3-4961-8d6d-ac10e68424f6\") " pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:12 crc kubenswrapper[4756]: I0318 14:36:12.109692 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce9d333c-02e3-4961-8d6d-ac10e68424f6-catalog-content\") pod \"certified-operators-mvrzv\" (UID: \"ce9d333c-02e3-4961-8d6d-ac10e68424f6\") " pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:12 crc kubenswrapper[4756]: I0318 14:36:12.109869 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce9d333c-02e3-4961-8d6d-ac10e68424f6-utilities\") pod \"certified-operators-mvrzv\" (UID: \"ce9d333c-02e3-4961-8d6d-ac10e68424f6\") " pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:12 crc kubenswrapper[4756]: I0318 14:36:12.110227 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce9d333c-02e3-4961-8d6d-ac10e68424f6-catalog-content\") pod \"certified-operators-mvrzv\" (UID: \"ce9d333c-02e3-4961-8d6d-ac10e68424f6\") " pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:12 crc kubenswrapper[4756]: I0318 14:36:12.110358 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce9d333c-02e3-4961-8d6d-ac10e68424f6-utilities\") pod \"certified-operators-mvrzv\" (UID: \"ce9d333c-02e3-4961-8d6d-ac10e68424f6\") " pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:12 crc kubenswrapper[4756]: I0318 14:36:12.136893 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbp5b\" (UniqueName: \"kubernetes.io/projected/ce9d333c-02e3-4961-8d6d-ac10e68424f6-kube-api-access-vbp5b\") pod \"certified-operators-mvrzv\" (UID: \"ce9d333c-02e3-4961-8d6d-ac10e68424f6\") " pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:12 crc kubenswrapper[4756]: I0318 14:36:12.288406 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:13 crc kubenswrapper[4756]: I0318 14:36:13.663288 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mvrzv"] Mar 18 14:36:14 crc kubenswrapper[4756]: I0318 14:36:14.675277 4756 generic.go:334] "Generic (PLEG): container finished" podID="ce9d333c-02e3-4961-8d6d-ac10e68424f6" containerID="82e51cd0e4529519cc80b70fb0f27bf7a95affb9c54f507ab5eb42114f033382" exitCode=0 Mar 18 14:36:14 crc kubenswrapper[4756]: I0318 14:36:14.675349 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvrzv" event={"ID":"ce9d333c-02e3-4961-8d6d-ac10e68424f6","Type":"ContainerDied","Data":"82e51cd0e4529519cc80b70fb0f27bf7a95affb9c54f507ab5eb42114f033382"} Mar 18 14:36:14 crc kubenswrapper[4756]: I0318 14:36:14.675607 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvrzv" event={"ID":"ce9d333c-02e3-4961-8d6d-ac10e68424f6","Type":"ContainerStarted","Data":"a99991b21e1026295fa7c6ae8d2ef7f5d136621bec9bf55b4d6f22bd574dcb9b"} Mar 18 14:36:15 crc kubenswrapper[4756]: I0318 14:36:15.046193 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-lxm7t"] Mar 18 14:36:15 crc kubenswrapper[4756]: I0318 14:36:15.057930 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-lxm7t"] Mar 18 14:36:15 crc kubenswrapper[4756]: I0318 14:36:15.330560 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2" path="/var/lib/kubelet/pods/6375bd8a-1cd5-4306-99ae-6fa6a8d07fc2/volumes" Mar 18 14:36:15 crc kubenswrapper[4756]: I0318 14:36:15.686499 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvrzv" 
event={"ID":"ce9d333c-02e3-4961-8d6d-ac10e68424f6","Type":"ContainerStarted","Data":"21b41c0fb1c7dc1a3d722b2d163092914108e2ef88c2ea80788bf78b0ca0d929"} Mar 18 14:36:17 crc kubenswrapper[4756]: I0318 14:36:17.707066 4756 generic.go:334] "Generic (PLEG): container finished" podID="ce9d333c-02e3-4961-8d6d-ac10e68424f6" containerID="21b41c0fb1c7dc1a3d722b2d163092914108e2ef88c2ea80788bf78b0ca0d929" exitCode=0 Mar 18 14:36:17 crc kubenswrapper[4756]: I0318 14:36:17.707163 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvrzv" event={"ID":"ce9d333c-02e3-4961-8d6d-ac10e68424f6","Type":"ContainerDied","Data":"21b41c0fb1c7dc1a3d722b2d163092914108e2ef88c2ea80788bf78b0ca0d929"} Mar 18 14:36:18 crc kubenswrapper[4756]: I0318 14:36:18.717991 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvrzv" event={"ID":"ce9d333c-02e3-4961-8d6d-ac10e68424f6","Type":"ContainerStarted","Data":"a4ae205803cfa2429ee389def0efa3a3feb3ea8f8daa0ab15939dfa50bd0550f"} Mar 18 14:36:18 crc kubenswrapper[4756]: I0318 14:36:18.737918 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mvrzv" podStartSLOduration=4.12532833 podStartE2EDuration="7.737897217s" podCreationTimestamp="2026-03-18 14:36:11 +0000 UTC" firstStartedPulling="2026-03-18 14:36:14.677449648 +0000 UTC m=+2175.991867623" lastFinishedPulling="2026-03-18 14:36:18.290018545 +0000 UTC m=+2179.604436510" observedRunningTime="2026-03-18 14:36:18.737029114 +0000 UTC m=+2180.051447089" watchObservedRunningTime="2026-03-18 14:36:18.737897217 +0000 UTC m=+2180.052315192" Mar 18 14:36:20 crc kubenswrapper[4756]: I0318 14:36:20.036107 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-djhlg"] Mar 18 14:36:20 crc kubenswrapper[4756]: I0318 14:36:20.047415 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cloudkitty-storageinit-djhlg"] Mar 18 14:36:21 crc kubenswrapper[4756]: I0318 14:36:21.330251 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b12455cd-3670-4081-8e78-d2088ac075cc" path="/var/lib/kubelet/pods/b12455cd-3670-4081-8e78-d2088ac075cc/volumes" Mar 18 14:36:22 crc kubenswrapper[4756]: I0318 14:36:22.289350 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:22 crc kubenswrapper[4756]: I0318 14:36:22.289704 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:22 crc kubenswrapper[4756]: I0318 14:36:22.368864 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:32 crc kubenswrapper[4756]: I0318 14:36:32.358337 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:32 crc kubenswrapper[4756]: I0318 14:36:32.426843 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mvrzv"] Mar 18 14:36:32 crc kubenswrapper[4756]: I0318 14:36:32.840954 4756 scope.go:117] "RemoveContainer" containerID="75adf43572dee2082d88455761472c1f390fa1535eaf69f33930b347c9f759cb" Mar 18 14:36:32 crc kubenswrapper[4756]: I0318 14:36:32.861089 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mvrzv" podUID="ce9d333c-02e3-4961-8d6d-ac10e68424f6" containerName="registry-server" containerID="cri-o://a4ae205803cfa2429ee389def0efa3a3feb3ea8f8daa0ab15939dfa50bd0550f" gracePeriod=2 Mar 18 14:36:32 crc kubenswrapper[4756]: I0318 14:36:32.878941 4756 scope.go:117] "RemoveContainer" containerID="dd068bd5d7c56f6e465654a61fd010916bdfaaaab2c4861d7bdc786d55e07e99" Mar 18 14:36:33 crc 
kubenswrapper[4756]: I0318 14:36:33.071240 4756 scope.go:117] "RemoveContainer" containerID="23a0f64851bd0910ace15275d32efbd0fc9768ae29b909441bfcf60b1f7ee25f" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.401987 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.564920 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce9d333c-02e3-4961-8d6d-ac10e68424f6-utilities\") pod \"ce9d333c-02e3-4961-8d6d-ac10e68424f6\" (UID: \"ce9d333c-02e3-4961-8d6d-ac10e68424f6\") " Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.565061 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbp5b\" (UniqueName: \"kubernetes.io/projected/ce9d333c-02e3-4961-8d6d-ac10e68424f6-kube-api-access-vbp5b\") pod \"ce9d333c-02e3-4961-8d6d-ac10e68424f6\" (UID: \"ce9d333c-02e3-4961-8d6d-ac10e68424f6\") " Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.565092 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce9d333c-02e3-4961-8d6d-ac10e68424f6-catalog-content\") pod \"ce9d333c-02e3-4961-8d6d-ac10e68424f6\" (UID: \"ce9d333c-02e3-4961-8d6d-ac10e68424f6\") " Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.567053 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce9d333c-02e3-4961-8d6d-ac10e68424f6-utilities" (OuterVolumeSpecName: "utilities") pod "ce9d333c-02e3-4961-8d6d-ac10e68424f6" (UID: "ce9d333c-02e3-4961-8d6d-ac10e68424f6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.577188 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce9d333c-02e3-4961-8d6d-ac10e68424f6-kube-api-access-vbp5b" (OuterVolumeSpecName: "kube-api-access-vbp5b") pod "ce9d333c-02e3-4961-8d6d-ac10e68424f6" (UID: "ce9d333c-02e3-4961-8d6d-ac10e68424f6"). InnerVolumeSpecName "kube-api-access-vbp5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.623543 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce9d333c-02e3-4961-8d6d-ac10e68424f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce9d333c-02e3-4961-8d6d-ac10e68424f6" (UID: "ce9d333c-02e3-4961-8d6d-ac10e68424f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.667156 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce9d333c-02e3-4961-8d6d-ac10e68424f6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.667198 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbp5b\" (UniqueName: \"kubernetes.io/projected/ce9d333c-02e3-4961-8d6d-ac10e68424f6-kube-api-access-vbp5b\") on node \"crc\" DevicePath \"\"" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.667213 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce9d333c-02e3-4961-8d6d-ac10e68424f6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.873764 4756 generic.go:334] "Generic (PLEG): container finished" podID="ce9d333c-02e3-4961-8d6d-ac10e68424f6" 
containerID="a4ae205803cfa2429ee389def0efa3a3feb3ea8f8daa0ab15939dfa50bd0550f" exitCode=0 Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.873820 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mvrzv" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.873858 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvrzv" event={"ID":"ce9d333c-02e3-4961-8d6d-ac10e68424f6","Type":"ContainerDied","Data":"a4ae205803cfa2429ee389def0efa3a3feb3ea8f8daa0ab15939dfa50bd0550f"} Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.874236 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvrzv" event={"ID":"ce9d333c-02e3-4961-8d6d-ac10e68424f6","Type":"ContainerDied","Data":"a99991b21e1026295fa7c6ae8d2ef7f5d136621bec9bf55b4d6f22bd574dcb9b"} Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.874259 4756 scope.go:117] "RemoveContainer" containerID="a4ae205803cfa2429ee389def0efa3a3feb3ea8f8daa0ab15939dfa50bd0550f" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.925469 4756 scope.go:117] "RemoveContainer" containerID="21b41c0fb1c7dc1a3d722b2d163092914108e2ef88c2ea80788bf78b0ca0d929" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.948534 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mvrzv"] Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.952488 4756 scope.go:117] "RemoveContainer" containerID="82e51cd0e4529519cc80b70fb0f27bf7a95affb9c54f507ab5eb42114f033382" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.960323 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mvrzv"] Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.992007 4756 scope.go:117] "RemoveContainer" containerID="a4ae205803cfa2429ee389def0efa3a3feb3ea8f8daa0ab15939dfa50bd0550f" Mar 18 
14:36:33 crc kubenswrapper[4756]: E0318 14:36:33.992598 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ae205803cfa2429ee389def0efa3a3feb3ea8f8daa0ab15939dfa50bd0550f\": container with ID starting with a4ae205803cfa2429ee389def0efa3a3feb3ea8f8daa0ab15939dfa50bd0550f not found: ID does not exist" containerID="a4ae205803cfa2429ee389def0efa3a3feb3ea8f8daa0ab15939dfa50bd0550f" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.992639 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ae205803cfa2429ee389def0efa3a3feb3ea8f8daa0ab15939dfa50bd0550f"} err="failed to get container status \"a4ae205803cfa2429ee389def0efa3a3feb3ea8f8daa0ab15939dfa50bd0550f\": rpc error: code = NotFound desc = could not find container \"a4ae205803cfa2429ee389def0efa3a3feb3ea8f8daa0ab15939dfa50bd0550f\": container with ID starting with a4ae205803cfa2429ee389def0efa3a3feb3ea8f8daa0ab15939dfa50bd0550f not found: ID does not exist" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.992664 4756 scope.go:117] "RemoveContainer" containerID="21b41c0fb1c7dc1a3d722b2d163092914108e2ef88c2ea80788bf78b0ca0d929" Mar 18 14:36:33 crc kubenswrapper[4756]: E0318 14:36:33.993231 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21b41c0fb1c7dc1a3d722b2d163092914108e2ef88c2ea80788bf78b0ca0d929\": container with ID starting with 21b41c0fb1c7dc1a3d722b2d163092914108e2ef88c2ea80788bf78b0ca0d929 not found: ID does not exist" containerID="21b41c0fb1c7dc1a3d722b2d163092914108e2ef88c2ea80788bf78b0ca0d929" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.993280 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21b41c0fb1c7dc1a3d722b2d163092914108e2ef88c2ea80788bf78b0ca0d929"} err="failed to get container status 
\"21b41c0fb1c7dc1a3d722b2d163092914108e2ef88c2ea80788bf78b0ca0d929\": rpc error: code = NotFound desc = could not find container \"21b41c0fb1c7dc1a3d722b2d163092914108e2ef88c2ea80788bf78b0ca0d929\": container with ID starting with 21b41c0fb1c7dc1a3d722b2d163092914108e2ef88c2ea80788bf78b0ca0d929 not found: ID does not exist" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.993308 4756 scope.go:117] "RemoveContainer" containerID="82e51cd0e4529519cc80b70fb0f27bf7a95affb9c54f507ab5eb42114f033382" Mar 18 14:36:33 crc kubenswrapper[4756]: E0318 14:36:33.993594 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82e51cd0e4529519cc80b70fb0f27bf7a95affb9c54f507ab5eb42114f033382\": container with ID starting with 82e51cd0e4529519cc80b70fb0f27bf7a95affb9c54f507ab5eb42114f033382 not found: ID does not exist" containerID="82e51cd0e4529519cc80b70fb0f27bf7a95affb9c54f507ab5eb42114f033382" Mar 18 14:36:33 crc kubenswrapper[4756]: I0318 14:36:33.993633 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e51cd0e4529519cc80b70fb0f27bf7a95affb9c54f507ab5eb42114f033382"} err="failed to get container status \"82e51cd0e4529519cc80b70fb0f27bf7a95affb9c54f507ab5eb42114f033382\": rpc error: code = NotFound desc = could not find container \"82e51cd0e4529519cc80b70fb0f27bf7a95affb9c54f507ab5eb42114f033382\": container with ID starting with 82e51cd0e4529519cc80b70fb0f27bf7a95affb9c54f507ab5eb42114f033382 not found: ID does not exist" Mar 18 14:36:35 crc kubenswrapper[4756]: I0318 14:36:35.331952 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce9d333c-02e3-4961-8d6d-ac10e68424f6" path="/var/lib/kubelet/pods/ce9d333c-02e3-4961-8d6d-ac10e68424f6/volumes" Mar 18 14:36:50 crc kubenswrapper[4756]: I0318 14:36:50.048965 4756 generic.go:334] "Generic (PLEG): container finished" podID="b3d1261b-4146-4fc4-baa5-79ac98704bcd" 
containerID="99c9cca5e96f9b330fa1bcfdba3d2cd655eed8546214c61354ac7136ab84659b" exitCode=0 Mar 18 14:36:50 crc kubenswrapper[4756]: I0318 14:36:50.049075 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" event={"ID":"b3d1261b-4146-4fc4-baa5-79ac98704bcd","Type":"ContainerDied","Data":"99c9cca5e96f9b330fa1bcfdba3d2cd655eed8546214c61354ac7136ab84659b"} Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.584968 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.761950 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ssh-key-openstack-edpm-ipam\") pod \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.762093 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-inventory\") pod \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.762234 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ovn-combined-ca-bundle\") pod \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.762285 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnzv8\" (UniqueName: \"kubernetes.io/projected/b3d1261b-4146-4fc4-baa5-79ac98704bcd-kube-api-access-xnzv8\") pod 
\"b3d1261b-4146-4fc4-baa5-79ac98704bcd\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.762429 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ovncontroller-config-0\") pod \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\" (UID: \"b3d1261b-4146-4fc4-baa5-79ac98704bcd\") " Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.781003 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d1261b-4146-4fc4-baa5-79ac98704bcd-kube-api-access-xnzv8" (OuterVolumeSpecName: "kube-api-access-xnzv8") pod "b3d1261b-4146-4fc4-baa5-79ac98704bcd" (UID: "b3d1261b-4146-4fc4-baa5-79ac98704bcd"). InnerVolumeSpecName "kube-api-access-xnzv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.794577 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "b3d1261b-4146-4fc4-baa5-79ac98704bcd" (UID: "b3d1261b-4146-4fc4-baa5-79ac98704bcd"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.830058 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "b3d1261b-4146-4fc4-baa5-79ac98704bcd" (UID: "b3d1261b-4146-4fc4-baa5-79ac98704bcd"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.836302 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b3d1261b-4146-4fc4-baa5-79ac98704bcd" (UID: "b3d1261b-4146-4fc4-baa5-79ac98704bcd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.845159 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-inventory" (OuterVolumeSpecName: "inventory") pod "b3d1261b-4146-4fc4-baa5-79ac98704bcd" (UID: "b3d1261b-4146-4fc4-baa5-79ac98704bcd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.866404 4756 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.866444 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnzv8\" (UniqueName: \"kubernetes.io/projected/b3d1261b-4146-4fc4-baa5-79ac98704bcd-kube-api-access-xnzv8\") on node \"crc\" DevicePath \"\"" Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.866457 4756 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.866466 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:36:51 crc kubenswrapper[4756]: I0318 14:36:51.866476 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3d1261b-4146-4fc4-baa5-79ac98704bcd-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.067224 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" event={"ID":"b3d1261b-4146-4fc4-baa5-79ac98704bcd","Type":"ContainerDied","Data":"1a48015da914df1330fdc79dc13bf15497a90ffebb4603fa151a88d0ba4c3f40"} Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.067450 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a48015da914df1330fdc79dc13bf15497a90ffebb4603fa151a88d0ba4c3f40" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.067522 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b68mn" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.156767 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c"] Mar 18 14:36:52 crc kubenswrapper[4756]: E0318 14:36:52.157653 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce9d333c-02e3-4961-8d6d-ac10e68424f6" containerName="extract-content" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.157673 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce9d333c-02e3-4961-8d6d-ac10e68424f6" containerName="extract-content" Mar 18 14:36:52 crc kubenswrapper[4756]: E0318 14:36:52.157697 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce9d333c-02e3-4961-8d6d-ac10e68424f6" containerName="extract-utilities" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.157704 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce9d333c-02e3-4961-8d6d-ac10e68424f6" containerName="extract-utilities" Mar 18 14:36:52 crc kubenswrapper[4756]: E0318 14:36:52.157715 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce9d333c-02e3-4961-8d6d-ac10e68424f6" containerName="registry-server" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.157721 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce9d333c-02e3-4961-8d6d-ac10e68424f6" containerName="registry-server" Mar 18 14:36:52 crc kubenswrapper[4756]: E0318 14:36:52.157738 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d1261b-4146-4fc4-baa5-79ac98704bcd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.157746 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d1261b-4146-4fc4-baa5-79ac98704bcd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.157950 4756 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ce9d333c-02e3-4961-8d6d-ac10e68424f6" containerName="registry-server" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.157967 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d1261b-4146-4fc4-baa5-79ac98704bcd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.158716 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.161367 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.161597 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.161913 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.162047 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.162195 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.162726 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.170617 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.170666 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfkcd\" (UniqueName: \"kubernetes.io/projected/044cc30f-7755-47c5-8b78-84c89ee897bf-kube-api-access-xfkcd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.170921 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.170955 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.171054 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.171088 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.180347 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c"] Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.272098 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfkcd\" (UniqueName: \"kubernetes.io/projected/044cc30f-7755-47c5-8b78-84c89ee897bf-kube-api-access-xfkcd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.272206 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.272226 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.272261 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.272301 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.272899 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.276902 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.276914 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.277015 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.277043 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.277382 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: 
\"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.291342 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfkcd\" (UniqueName: \"kubernetes.io/projected/044cc30f-7755-47c5-8b78-84c89ee897bf-kube-api-access-xfkcd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:52 crc kubenswrapper[4756]: I0318 14:36:52.478528 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:36:53 crc kubenswrapper[4756]: I0318 14:36:53.059244 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c"] Mar 18 14:36:53 crc kubenswrapper[4756]: I0318 14:36:53.059277 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:36:53 crc kubenswrapper[4756]: I0318 14:36:53.096484 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" event={"ID":"044cc30f-7755-47c5-8b78-84c89ee897bf","Type":"ContainerStarted","Data":"564d79f4df1e5e622d6d8427370354a50a956985cc061baa642e3a1a13a2c2e1"} Mar 18 14:36:54 crc kubenswrapper[4756]: I0318 14:36:54.108294 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" event={"ID":"044cc30f-7755-47c5-8b78-84c89ee897bf","Type":"ContainerStarted","Data":"7ccc248ba882cf3971aa4f6082dc6e45d7e493e5297c48f314a9ff671ab031fe"} Mar 18 14:36:54 crc kubenswrapper[4756]: I0318 14:36:54.131870 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" podStartSLOduration=1.7009306309999999 podStartE2EDuration="2.131850043s" podCreationTimestamp="2026-03-18 14:36:52 +0000 UTC" firstStartedPulling="2026-03-18 14:36:53.058974827 +0000 UTC m=+2214.373392812" lastFinishedPulling="2026-03-18 14:36:53.489894249 +0000 UTC m=+2214.804312224" observedRunningTime="2026-03-18 14:36:54.1272628 +0000 UTC m=+2215.441680775" watchObservedRunningTime="2026-03-18 14:36:54.131850043 +0000 UTC m=+2215.446268018" Mar 18 14:37:06 crc kubenswrapper[4756]: I0318 14:37:06.915915 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:37:06 crc kubenswrapper[4756]: I0318 14:37:06.916558 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:37:18 crc kubenswrapper[4756]: I0318 14:37:18.973104 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-clwwj"] Mar 18 14:37:18 crc kubenswrapper[4756]: I0318 14:37:18.978275 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:18 crc kubenswrapper[4756]: I0318 14:37:18.998149 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-clwwj"] Mar 18 14:37:19 crc kubenswrapper[4756]: I0318 14:37:19.110008 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d377e13c-3048-4675-898a-97ed00be93a2-utilities\") pod \"redhat-marketplace-clwwj\" (UID: \"d377e13c-3048-4675-898a-97ed00be93a2\") " pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:19 crc kubenswrapper[4756]: I0318 14:37:19.110064 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d377e13c-3048-4675-898a-97ed00be93a2-catalog-content\") pod \"redhat-marketplace-clwwj\" (UID: \"d377e13c-3048-4675-898a-97ed00be93a2\") " pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:19 crc kubenswrapper[4756]: I0318 14:37:19.110317 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncdkm\" (UniqueName: \"kubernetes.io/projected/d377e13c-3048-4675-898a-97ed00be93a2-kube-api-access-ncdkm\") pod \"redhat-marketplace-clwwj\" (UID: \"d377e13c-3048-4675-898a-97ed00be93a2\") " pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:19 crc kubenswrapper[4756]: I0318 14:37:19.211792 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d377e13c-3048-4675-898a-97ed00be93a2-utilities\") pod \"redhat-marketplace-clwwj\" (UID: \"d377e13c-3048-4675-898a-97ed00be93a2\") " pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:19 crc kubenswrapper[4756]: I0318 14:37:19.211841 4756 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d377e13c-3048-4675-898a-97ed00be93a2-catalog-content\") pod \"redhat-marketplace-clwwj\" (UID: \"d377e13c-3048-4675-898a-97ed00be93a2\") " pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:19 crc kubenswrapper[4756]: I0318 14:37:19.211950 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncdkm\" (UniqueName: \"kubernetes.io/projected/d377e13c-3048-4675-898a-97ed00be93a2-kube-api-access-ncdkm\") pod \"redhat-marketplace-clwwj\" (UID: \"d377e13c-3048-4675-898a-97ed00be93a2\") " pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:19 crc kubenswrapper[4756]: I0318 14:37:19.212694 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d377e13c-3048-4675-898a-97ed00be93a2-utilities\") pod \"redhat-marketplace-clwwj\" (UID: \"d377e13c-3048-4675-898a-97ed00be93a2\") " pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:19 crc kubenswrapper[4756]: I0318 14:37:19.212906 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d377e13c-3048-4675-898a-97ed00be93a2-catalog-content\") pod \"redhat-marketplace-clwwj\" (UID: \"d377e13c-3048-4675-898a-97ed00be93a2\") " pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:19 crc kubenswrapper[4756]: I0318 14:37:19.230228 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncdkm\" (UniqueName: \"kubernetes.io/projected/d377e13c-3048-4675-898a-97ed00be93a2-kube-api-access-ncdkm\") pod \"redhat-marketplace-clwwj\" (UID: \"d377e13c-3048-4675-898a-97ed00be93a2\") " pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:19 crc kubenswrapper[4756]: I0318 14:37:19.309582 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:19 crc kubenswrapper[4756]: I0318 14:37:19.831932 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-clwwj"] Mar 18 14:37:20 crc kubenswrapper[4756]: I0318 14:37:20.442942 4756 generic.go:334] "Generic (PLEG): container finished" podID="d377e13c-3048-4675-898a-97ed00be93a2" containerID="96244c2b49d8fa255b43eeed1f91b9f24ef939e5fa533f37597f45cfb46f74c3" exitCode=0 Mar 18 14:37:20 crc kubenswrapper[4756]: I0318 14:37:20.443214 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-clwwj" event={"ID":"d377e13c-3048-4675-898a-97ed00be93a2","Type":"ContainerDied","Data":"96244c2b49d8fa255b43eeed1f91b9f24ef939e5fa533f37597f45cfb46f74c3"} Mar 18 14:37:20 crc kubenswrapper[4756]: I0318 14:37:20.443241 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-clwwj" event={"ID":"d377e13c-3048-4675-898a-97ed00be93a2","Type":"ContainerStarted","Data":"6235f1e38e5337cf64cb45e4e543b2903b666a02f2adb4dd77e28ca0476a35bd"} Mar 18 14:37:21 crc kubenswrapper[4756]: I0318 14:37:21.453945 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-clwwj" event={"ID":"d377e13c-3048-4675-898a-97ed00be93a2","Type":"ContainerStarted","Data":"59d7ca5a55449accdac58ec6a276503ed5f63d7a1634a40ad7185e36e37ed536"} Mar 18 14:37:22 crc kubenswrapper[4756]: I0318 14:37:22.471871 4756 generic.go:334] "Generic (PLEG): container finished" podID="d377e13c-3048-4675-898a-97ed00be93a2" containerID="59d7ca5a55449accdac58ec6a276503ed5f63d7a1634a40ad7185e36e37ed536" exitCode=0 Mar 18 14:37:22 crc kubenswrapper[4756]: I0318 14:37:22.471946 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-clwwj" 
event={"ID":"d377e13c-3048-4675-898a-97ed00be93a2","Type":"ContainerDied","Data":"59d7ca5a55449accdac58ec6a276503ed5f63d7a1634a40ad7185e36e37ed536"} Mar 18 14:37:23 crc kubenswrapper[4756]: I0318 14:37:23.485818 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-clwwj" event={"ID":"d377e13c-3048-4675-898a-97ed00be93a2","Type":"ContainerStarted","Data":"6daae01339913d90df83d54551c9ba2aff45533f8adcb1be227e37bb1edc874f"} Mar 18 14:37:23 crc kubenswrapper[4756]: I0318 14:37:23.515348 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-clwwj" podStartSLOduration=3.047063028 podStartE2EDuration="5.515333116s" podCreationTimestamp="2026-03-18 14:37:18 +0000 UTC" firstStartedPulling="2026-03-18 14:37:20.445742631 +0000 UTC m=+2241.760160606" lastFinishedPulling="2026-03-18 14:37:22.914012719 +0000 UTC m=+2244.228430694" observedRunningTime="2026-03-18 14:37:23.510982179 +0000 UTC m=+2244.825400164" watchObservedRunningTime="2026-03-18 14:37:23.515333116 +0000 UTC m=+2244.829751091" Mar 18 14:37:29 crc kubenswrapper[4756]: I0318 14:37:29.309834 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:29 crc kubenswrapper[4756]: I0318 14:37:29.310431 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:29 crc kubenswrapper[4756]: I0318 14:37:29.351181 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:29 crc kubenswrapper[4756]: I0318 14:37:29.597831 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:29 crc kubenswrapper[4756]: I0318 14:37:29.650153 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-clwwj"] Mar 18 14:37:31 crc kubenswrapper[4756]: I0318 14:37:31.570932 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-clwwj" podUID="d377e13c-3048-4675-898a-97ed00be93a2" containerName="registry-server" containerID="cri-o://6daae01339913d90df83d54551c9ba2aff45533f8adcb1be227e37bb1edc874f" gracePeriod=2 Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.092222 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.195525 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d377e13c-3048-4675-898a-97ed00be93a2-utilities\") pod \"d377e13c-3048-4675-898a-97ed00be93a2\" (UID: \"d377e13c-3048-4675-898a-97ed00be93a2\") " Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.195596 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d377e13c-3048-4675-898a-97ed00be93a2-catalog-content\") pod \"d377e13c-3048-4675-898a-97ed00be93a2\" (UID: \"d377e13c-3048-4675-898a-97ed00be93a2\") " Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.195622 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncdkm\" (UniqueName: \"kubernetes.io/projected/d377e13c-3048-4675-898a-97ed00be93a2-kube-api-access-ncdkm\") pod \"d377e13c-3048-4675-898a-97ed00be93a2\" (UID: \"d377e13c-3048-4675-898a-97ed00be93a2\") " Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.196840 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d377e13c-3048-4675-898a-97ed00be93a2-utilities" (OuterVolumeSpecName: "utilities") pod "d377e13c-3048-4675-898a-97ed00be93a2" (UID: 
"d377e13c-3048-4675-898a-97ed00be93a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.202482 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d377e13c-3048-4675-898a-97ed00be93a2-kube-api-access-ncdkm" (OuterVolumeSpecName: "kube-api-access-ncdkm") pod "d377e13c-3048-4675-898a-97ed00be93a2" (UID: "d377e13c-3048-4675-898a-97ed00be93a2"). InnerVolumeSpecName "kube-api-access-ncdkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.245723 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d377e13c-3048-4675-898a-97ed00be93a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d377e13c-3048-4675-898a-97ed00be93a2" (UID: "d377e13c-3048-4675-898a-97ed00be93a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.298745 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d377e13c-3048-4675-898a-97ed00be93a2-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.298794 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d377e13c-3048-4675-898a-97ed00be93a2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.298810 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncdkm\" (UniqueName: \"kubernetes.io/projected/d377e13c-3048-4675-898a-97ed00be93a2-kube-api-access-ncdkm\") on node \"crc\" DevicePath \"\"" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.583067 4756 generic.go:334] "Generic (PLEG): container finished" 
podID="d377e13c-3048-4675-898a-97ed00be93a2" containerID="6daae01339913d90df83d54551c9ba2aff45533f8adcb1be227e37bb1edc874f" exitCode=0 Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.583165 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-clwwj" event={"ID":"d377e13c-3048-4675-898a-97ed00be93a2","Type":"ContainerDied","Data":"6daae01339913d90df83d54551c9ba2aff45533f8adcb1be227e37bb1edc874f"} Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.583220 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-clwwj" event={"ID":"d377e13c-3048-4675-898a-97ed00be93a2","Type":"ContainerDied","Data":"6235f1e38e5337cf64cb45e4e543b2903b666a02f2adb4dd77e28ca0476a35bd"} Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.583231 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-clwwj" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.583252 4756 scope.go:117] "RemoveContainer" containerID="6daae01339913d90df83d54551c9ba2aff45533f8adcb1be227e37bb1edc874f" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.609962 4756 scope.go:117] "RemoveContainer" containerID="59d7ca5a55449accdac58ec6a276503ed5f63d7a1634a40ad7185e36e37ed536" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.639230 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-clwwj"] Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.647017 4756 scope.go:117] "RemoveContainer" containerID="96244c2b49d8fa255b43eeed1f91b9f24ef939e5fa533f37597f45cfb46f74c3" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.652368 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-clwwj"] Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.706297 4756 scope.go:117] "RemoveContainer" 
containerID="6daae01339913d90df83d54551c9ba2aff45533f8adcb1be227e37bb1edc874f" Mar 18 14:37:32 crc kubenswrapper[4756]: E0318 14:37:32.707204 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6daae01339913d90df83d54551c9ba2aff45533f8adcb1be227e37bb1edc874f\": container with ID starting with 6daae01339913d90df83d54551c9ba2aff45533f8adcb1be227e37bb1edc874f not found: ID does not exist" containerID="6daae01339913d90df83d54551c9ba2aff45533f8adcb1be227e37bb1edc874f" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.707244 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6daae01339913d90df83d54551c9ba2aff45533f8adcb1be227e37bb1edc874f"} err="failed to get container status \"6daae01339913d90df83d54551c9ba2aff45533f8adcb1be227e37bb1edc874f\": rpc error: code = NotFound desc = could not find container \"6daae01339913d90df83d54551c9ba2aff45533f8adcb1be227e37bb1edc874f\": container with ID starting with 6daae01339913d90df83d54551c9ba2aff45533f8adcb1be227e37bb1edc874f not found: ID does not exist" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.707291 4756 scope.go:117] "RemoveContainer" containerID="59d7ca5a55449accdac58ec6a276503ed5f63d7a1634a40ad7185e36e37ed536" Mar 18 14:37:32 crc kubenswrapper[4756]: E0318 14:37:32.707932 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59d7ca5a55449accdac58ec6a276503ed5f63d7a1634a40ad7185e36e37ed536\": container with ID starting with 59d7ca5a55449accdac58ec6a276503ed5f63d7a1634a40ad7185e36e37ed536 not found: ID does not exist" containerID="59d7ca5a55449accdac58ec6a276503ed5f63d7a1634a40ad7185e36e37ed536" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.707952 4756 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"59d7ca5a55449accdac58ec6a276503ed5f63d7a1634a40ad7185e36e37ed536"} err="failed to get container status \"59d7ca5a55449accdac58ec6a276503ed5f63d7a1634a40ad7185e36e37ed536\": rpc error: code = NotFound desc = could not find container \"59d7ca5a55449accdac58ec6a276503ed5f63d7a1634a40ad7185e36e37ed536\": container with ID starting with 59d7ca5a55449accdac58ec6a276503ed5f63d7a1634a40ad7185e36e37ed536 not found: ID does not exist" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.707965 4756 scope.go:117] "RemoveContainer" containerID="96244c2b49d8fa255b43eeed1f91b9f24ef939e5fa533f37597f45cfb46f74c3" Mar 18 14:37:32 crc kubenswrapper[4756]: E0318 14:37:32.711704 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96244c2b49d8fa255b43eeed1f91b9f24ef939e5fa533f37597f45cfb46f74c3\": container with ID starting with 96244c2b49d8fa255b43eeed1f91b9f24ef939e5fa533f37597f45cfb46f74c3 not found: ID does not exist" containerID="96244c2b49d8fa255b43eeed1f91b9f24ef939e5fa533f37597f45cfb46f74c3" Mar 18 14:37:32 crc kubenswrapper[4756]: I0318 14:37:32.711811 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96244c2b49d8fa255b43eeed1f91b9f24ef939e5fa533f37597f45cfb46f74c3"} err="failed to get container status \"96244c2b49d8fa255b43eeed1f91b9f24ef939e5fa533f37597f45cfb46f74c3\": rpc error: code = NotFound desc = could not find container \"96244c2b49d8fa255b43eeed1f91b9f24ef939e5fa533f37597f45cfb46f74c3\": container with ID starting with 96244c2b49d8fa255b43eeed1f91b9f24ef939e5fa533f37597f45cfb46f74c3 not found: ID does not exist" Mar 18 14:37:33 crc kubenswrapper[4756]: I0318 14:37:33.357179 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d377e13c-3048-4675-898a-97ed00be93a2" path="/var/lib/kubelet/pods/d377e13c-3048-4675-898a-97ed00be93a2/volumes" Mar 18 14:37:36 crc kubenswrapper[4756]: I0318 
14:37:36.915139 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:37:36 crc kubenswrapper[4756]: I0318 14:37:36.915580 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:37:41 crc kubenswrapper[4756]: I0318 14:37:41.683259 4756 generic.go:334] "Generic (PLEG): container finished" podID="044cc30f-7755-47c5-8b78-84c89ee897bf" containerID="7ccc248ba882cf3971aa4f6082dc6e45d7e493e5297c48f314a9ff671ab031fe" exitCode=0 Mar 18 14:37:41 crc kubenswrapper[4756]: I0318 14:37:41.683357 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" event={"ID":"044cc30f-7755-47c5-8b78-84c89ee897bf","Type":"ContainerDied","Data":"7ccc248ba882cf3971aa4f6082dc6e45d7e493e5297c48f314a9ff671ab031fe"} Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.166716 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.245592 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-nova-metadata-neutron-config-0\") pod \"044cc30f-7755-47c5-8b78-84c89ee897bf\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.245672 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfkcd\" (UniqueName: \"kubernetes.io/projected/044cc30f-7755-47c5-8b78-84c89ee897bf-kube-api-access-xfkcd\") pod \"044cc30f-7755-47c5-8b78-84c89ee897bf\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.245791 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-neutron-ovn-metadata-agent-neutron-config-0\") pod \"044cc30f-7755-47c5-8b78-84c89ee897bf\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.245816 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-ssh-key-openstack-edpm-ipam\") pod \"044cc30f-7755-47c5-8b78-84c89ee897bf\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.246624 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-inventory\") pod \"044cc30f-7755-47c5-8b78-84c89ee897bf\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " Mar 18 14:37:43 crc 
kubenswrapper[4756]: I0318 14:37:43.246694 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-neutron-metadata-combined-ca-bundle\") pod \"044cc30f-7755-47c5-8b78-84c89ee897bf\" (UID: \"044cc30f-7755-47c5-8b78-84c89ee897bf\") " Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.251231 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/044cc30f-7755-47c5-8b78-84c89ee897bf-kube-api-access-xfkcd" (OuterVolumeSpecName: "kube-api-access-xfkcd") pod "044cc30f-7755-47c5-8b78-84c89ee897bf" (UID: "044cc30f-7755-47c5-8b78-84c89ee897bf"). InnerVolumeSpecName "kube-api-access-xfkcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.261600 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "044cc30f-7755-47c5-8b78-84c89ee897bf" (UID: "044cc30f-7755-47c5-8b78-84c89ee897bf"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.275407 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-inventory" (OuterVolumeSpecName: "inventory") pod "044cc30f-7755-47c5-8b78-84c89ee897bf" (UID: "044cc30f-7755-47c5-8b78-84c89ee897bf"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.277832 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "044cc30f-7755-47c5-8b78-84c89ee897bf" (UID: "044cc30f-7755-47c5-8b78-84c89ee897bf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.278296 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "044cc30f-7755-47c5-8b78-84c89ee897bf" (UID: "044cc30f-7755-47c5-8b78-84c89ee897bf"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.287697 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "044cc30f-7755-47c5-8b78-84c89ee897bf" (UID: "044cc30f-7755-47c5-8b78-84c89ee897bf"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.349671 4756 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.349709 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.349723 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.349735 4756 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.349748 4756 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/044cc30f-7755-47c5-8b78-84c89ee897bf-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.349761 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfkcd\" (UniqueName: \"kubernetes.io/projected/044cc30f-7755-47c5-8b78-84c89ee897bf-kube-api-access-xfkcd\") on node \"crc\" DevicePath \"\"" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.703783 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" event={"ID":"044cc30f-7755-47c5-8b78-84c89ee897bf","Type":"ContainerDied","Data":"564d79f4df1e5e622d6d8427370354a50a956985cc061baa642e3a1a13a2c2e1"} Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.703840 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="564d79f4df1e5e622d6d8427370354a50a956985cc061baa642e3a1a13a2c2e1" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.703812 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.841588 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f"] Mar 18 14:37:43 crc kubenswrapper[4756]: E0318 14:37:43.842345 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d377e13c-3048-4675-898a-97ed00be93a2" containerName="registry-server" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.842434 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d377e13c-3048-4675-898a-97ed00be93a2" containerName="registry-server" Mar 18 14:37:43 crc kubenswrapper[4756]: E0318 14:37:43.842519 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d377e13c-3048-4675-898a-97ed00be93a2" containerName="extract-utilities" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.842581 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d377e13c-3048-4675-898a-97ed00be93a2" containerName="extract-utilities" Mar 18 14:37:43 crc kubenswrapper[4756]: E0318 14:37:43.842654 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="044cc30f-7755-47c5-8b78-84c89ee897bf" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.842718 4756 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="044cc30f-7755-47c5-8b78-84c89ee897bf" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 14:37:43 crc kubenswrapper[4756]: E0318 14:37:43.842816 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d377e13c-3048-4675-898a-97ed00be93a2" containerName="extract-content" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.842882 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="d377e13c-3048-4675-898a-97ed00be93a2" containerName="extract-content" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.843226 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="044cc30f-7755-47c5-8b78-84c89ee897bf" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.843357 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="d377e13c-3048-4675-898a-97ed00be93a2" containerName="registry-server" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.844413 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.851277 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.851333 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.851402 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.852236 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.853898 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.856198 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f"] Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.961314 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xft9f\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.961434 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xft9f\" (UID: 
\"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.961508 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xft9f\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.961599 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b2bj\" (UniqueName: \"kubernetes.io/projected/2a04f637-4e7d-4efc-8fc0-ce511f450960-kube-api-access-9b2bj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xft9f\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:43 crc kubenswrapper[4756]: I0318 14:37:43.961703 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xft9f\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:44 crc kubenswrapper[4756]: I0318 14:37:44.063887 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xft9f\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:44 crc kubenswrapper[4756]: I0318 14:37:44.063948 
4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xft9f\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:44 crc kubenswrapper[4756]: I0318 14:37:44.063991 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xft9f\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:44 crc kubenswrapper[4756]: I0318 14:37:44.064056 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b2bj\" (UniqueName: \"kubernetes.io/projected/2a04f637-4e7d-4efc-8fc0-ce511f450960-kube-api-access-9b2bj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xft9f\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:44 crc kubenswrapper[4756]: I0318 14:37:44.064140 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xft9f\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:44 crc kubenswrapper[4756]: I0318 14:37:44.068182 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xft9f\" (UID: 
\"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:44 crc kubenswrapper[4756]: I0318 14:37:44.068698 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xft9f\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:44 crc kubenswrapper[4756]: I0318 14:37:44.078888 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xft9f\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:44 crc kubenswrapper[4756]: I0318 14:37:44.079726 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xft9f\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:44 crc kubenswrapper[4756]: I0318 14:37:44.088234 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b2bj\" (UniqueName: \"kubernetes.io/projected/2a04f637-4e7d-4efc-8fc0-ce511f450960-kube-api-access-9b2bj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xft9f\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:44 crc kubenswrapper[4756]: I0318 14:37:44.169405 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:37:44 crc kubenswrapper[4756]: I0318 14:37:44.745572 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f"] Mar 18 14:37:45 crc kubenswrapper[4756]: I0318 14:37:45.727991 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" event={"ID":"2a04f637-4e7d-4efc-8fc0-ce511f450960","Type":"ContainerStarted","Data":"88a47deb3e11a162647b9f55a481a055d711eef8dbd65fc3b3988d58ae5cb7dd"} Mar 18 14:37:45 crc kubenswrapper[4756]: I0318 14:37:45.728437 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" event={"ID":"2a04f637-4e7d-4efc-8fc0-ce511f450960","Type":"ContainerStarted","Data":"d7dc9223db5733eb0779ad6918d046f49b1056a10f4f7b1cc815a07f3c6fc947"} Mar 18 14:37:45 crc kubenswrapper[4756]: I0318 14:37:45.745802 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" podStartSLOduration=2.231824705 podStartE2EDuration="2.745781676s" podCreationTimestamp="2026-03-18 14:37:43 +0000 UTC" firstStartedPulling="2026-03-18 14:37:44.748368266 +0000 UTC m=+2266.062786241" lastFinishedPulling="2026-03-18 14:37:45.262325237 +0000 UTC m=+2266.576743212" observedRunningTime="2026-03-18 14:37:45.741225913 +0000 UTC m=+2267.055643908" watchObservedRunningTime="2026-03-18 14:37:45.745781676 +0000 UTC m=+2267.060199651" Mar 18 14:38:00 crc kubenswrapper[4756]: I0318 14:38:00.149593 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564078-hh6wl"] Mar 18 14:38:00 crc kubenswrapper[4756]: I0318 14:38:00.152782 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564078-hh6wl" Mar 18 14:38:00 crc kubenswrapper[4756]: I0318 14:38:00.155518 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:38:00 crc kubenswrapper[4756]: I0318 14:38:00.155821 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:38:00 crc kubenswrapper[4756]: I0318 14:38:00.156820 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:38:00 crc kubenswrapper[4756]: I0318 14:38:00.160923 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564078-hh6wl"] Mar 18 14:38:00 crc kubenswrapper[4756]: I0318 14:38:00.250967 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd7q5\" (UniqueName: \"kubernetes.io/projected/daa2530d-df7e-4e7d-ae6e-880282674a28-kube-api-access-vd7q5\") pod \"auto-csr-approver-29564078-hh6wl\" (UID: \"daa2530d-df7e-4e7d-ae6e-880282674a28\") " pod="openshift-infra/auto-csr-approver-29564078-hh6wl" Mar 18 14:38:00 crc kubenswrapper[4756]: I0318 14:38:00.353930 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd7q5\" (UniqueName: \"kubernetes.io/projected/daa2530d-df7e-4e7d-ae6e-880282674a28-kube-api-access-vd7q5\") pod \"auto-csr-approver-29564078-hh6wl\" (UID: \"daa2530d-df7e-4e7d-ae6e-880282674a28\") " pod="openshift-infra/auto-csr-approver-29564078-hh6wl" Mar 18 14:38:00 crc kubenswrapper[4756]: I0318 14:38:00.376428 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd7q5\" (UniqueName: \"kubernetes.io/projected/daa2530d-df7e-4e7d-ae6e-880282674a28-kube-api-access-vd7q5\") pod \"auto-csr-approver-29564078-hh6wl\" (UID: \"daa2530d-df7e-4e7d-ae6e-880282674a28\") " 
pod="openshift-infra/auto-csr-approver-29564078-hh6wl" Mar 18 14:38:00 crc kubenswrapper[4756]: I0318 14:38:00.477667 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564078-hh6wl" Mar 18 14:38:00 crc kubenswrapper[4756]: W0318 14:38:00.997946 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaa2530d_df7e_4e7d_ae6e_880282674a28.slice/crio-a03222ec132a172f4dbd9c3cbf263840fe4f341ced7aa94107908d0dd94dc224 WatchSource:0}: Error finding container a03222ec132a172f4dbd9c3cbf263840fe4f341ced7aa94107908d0dd94dc224: Status 404 returned error can't find the container with id a03222ec132a172f4dbd9c3cbf263840fe4f341ced7aa94107908d0dd94dc224 Mar 18 14:38:01 crc kubenswrapper[4756]: I0318 14:38:00.999994 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564078-hh6wl"] Mar 18 14:38:01 crc kubenswrapper[4756]: I0318 14:38:01.924883 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564078-hh6wl" event={"ID":"daa2530d-df7e-4e7d-ae6e-880282674a28","Type":"ContainerStarted","Data":"a03222ec132a172f4dbd9c3cbf263840fe4f341ced7aa94107908d0dd94dc224"} Mar 18 14:38:02 crc kubenswrapper[4756]: I0318 14:38:02.935018 4756 generic.go:334] "Generic (PLEG): container finished" podID="daa2530d-df7e-4e7d-ae6e-880282674a28" containerID="96efdf452898e5c0f6e3407dbd835ebd1df397036229e13aebe150374721e5d9" exitCode=0 Mar 18 14:38:02 crc kubenswrapper[4756]: I0318 14:38:02.935549 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564078-hh6wl" event={"ID":"daa2530d-df7e-4e7d-ae6e-880282674a28","Type":"ContainerDied","Data":"96efdf452898e5c0f6e3407dbd835ebd1df397036229e13aebe150374721e5d9"} Mar 18 14:38:04 crc kubenswrapper[4756]: I0318 14:38:04.360879 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564078-hh6wl" Mar 18 14:38:04 crc kubenswrapper[4756]: I0318 14:38:04.543099 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd7q5\" (UniqueName: \"kubernetes.io/projected/daa2530d-df7e-4e7d-ae6e-880282674a28-kube-api-access-vd7q5\") pod \"daa2530d-df7e-4e7d-ae6e-880282674a28\" (UID: \"daa2530d-df7e-4e7d-ae6e-880282674a28\") " Mar 18 14:38:04 crc kubenswrapper[4756]: I0318 14:38:04.548652 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daa2530d-df7e-4e7d-ae6e-880282674a28-kube-api-access-vd7q5" (OuterVolumeSpecName: "kube-api-access-vd7q5") pod "daa2530d-df7e-4e7d-ae6e-880282674a28" (UID: "daa2530d-df7e-4e7d-ae6e-880282674a28"). InnerVolumeSpecName "kube-api-access-vd7q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:38:04 crc kubenswrapper[4756]: I0318 14:38:04.646472 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd7q5\" (UniqueName: \"kubernetes.io/projected/daa2530d-df7e-4e7d-ae6e-880282674a28-kube-api-access-vd7q5\") on node \"crc\" DevicePath \"\"" Mar 18 14:38:04 crc kubenswrapper[4756]: I0318 14:38:04.956706 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564078-hh6wl" event={"ID":"daa2530d-df7e-4e7d-ae6e-880282674a28","Type":"ContainerDied","Data":"a03222ec132a172f4dbd9c3cbf263840fe4f341ced7aa94107908d0dd94dc224"} Mar 18 14:38:04 crc kubenswrapper[4756]: I0318 14:38:04.956759 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a03222ec132a172f4dbd9c3cbf263840fe4f341ced7aa94107908d0dd94dc224" Mar 18 14:38:04 crc kubenswrapper[4756]: I0318 14:38:04.956792 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564078-hh6wl" Mar 18 14:38:05 crc kubenswrapper[4756]: I0318 14:38:05.438993 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564072-n29j8"] Mar 18 14:38:05 crc kubenswrapper[4756]: I0318 14:38:05.451705 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564072-n29j8"] Mar 18 14:38:06 crc kubenswrapper[4756]: I0318 14:38:06.915190 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:38:06 crc kubenswrapper[4756]: I0318 14:38:06.915261 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:38:06 crc kubenswrapper[4756]: I0318 14:38:06.915322 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:38:06 crc kubenswrapper[4756]: I0318 14:38:06.916004 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47fe16283e653d97b821ea41888e12ff278529bb4d88752b5670809f30122a6d"} pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:38:06 crc kubenswrapper[4756]: I0318 14:38:06.916072 4756 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" containerID="cri-o://47fe16283e653d97b821ea41888e12ff278529bb4d88752b5670809f30122a6d" gracePeriod=600 Mar 18 14:38:07 crc kubenswrapper[4756]: I0318 14:38:07.326508 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50619664-2536-4ae7-b393-863d44c0e69c" path="/var/lib/kubelet/pods/50619664-2536-4ae7-b393-863d44c0e69c/volumes" Mar 18 14:38:07 crc kubenswrapper[4756]: I0318 14:38:07.998427 4756 generic.go:334] "Generic (PLEG): container finished" podID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerID="47fe16283e653d97b821ea41888e12ff278529bb4d88752b5670809f30122a6d" exitCode=0 Mar 18 14:38:07 crc kubenswrapper[4756]: I0318 14:38:07.998503 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerDied","Data":"47fe16283e653d97b821ea41888e12ff278529bb4d88752b5670809f30122a6d"} Mar 18 14:38:07 crc kubenswrapper[4756]: I0318 14:38:07.998819 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b"} Mar 18 14:38:07 crc kubenswrapper[4756]: I0318 14:38:07.998839 4756 scope.go:117] "RemoveContainer" containerID="8ba7b31e4ac9af18643c7f9744aaa7483d4c1a3f7cea50cb3447a0e603dba323" Mar 18 14:38:33 crc kubenswrapper[4756]: I0318 14:38:33.270837 4756 scope.go:117] "RemoveContainer" containerID="ae1ea28a89ba42c44eb60c7c1996e89cf44c02233749c26618fe767a6a8591d0" Mar 18 14:40:00 crc kubenswrapper[4756]: I0318 14:40:00.160689 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564080-g6g8x"] Mar 18 14:40:00 crc 
kubenswrapper[4756]: E0318 14:40:00.161952 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa2530d-df7e-4e7d-ae6e-880282674a28" containerName="oc" Mar 18 14:40:00 crc kubenswrapper[4756]: I0318 14:40:00.161977 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa2530d-df7e-4e7d-ae6e-880282674a28" containerName="oc" Mar 18 14:40:00 crc kubenswrapper[4756]: I0318 14:40:00.162365 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="daa2530d-df7e-4e7d-ae6e-880282674a28" containerName="oc" Mar 18 14:40:00 crc kubenswrapper[4756]: I0318 14:40:00.163389 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564080-g6g8x" Mar 18 14:40:00 crc kubenswrapper[4756]: I0318 14:40:00.165958 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:40:00 crc kubenswrapper[4756]: I0318 14:40:00.166311 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:40:00 crc kubenswrapper[4756]: I0318 14:40:00.166997 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:40:00 crc kubenswrapper[4756]: I0318 14:40:00.175994 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564080-g6g8x"] Mar 18 14:40:00 crc kubenswrapper[4756]: I0318 14:40:00.218635 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvfnx\" (UniqueName: \"kubernetes.io/projected/f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a-kube-api-access-cvfnx\") pod \"auto-csr-approver-29564080-g6g8x\" (UID: \"f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a\") " pod="openshift-infra/auto-csr-approver-29564080-g6g8x" Mar 18 14:40:00 crc kubenswrapper[4756]: I0318 14:40:00.323328 4756 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cvfnx\" (UniqueName: \"kubernetes.io/projected/f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a-kube-api-access-cvfnx\") pod \"auto-csr-approver-29564080-g6g8x\" (UID: \"f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a\") " pod="openshift-infra/auto-csr-approver-29564080-g6g8x" Mar 18 14:40:00 crc kubenswrapper[4756]: I0318 14:40:00.346942 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvfnx\" (UniqueName: \"kubernetes.io/projected/f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a-kube-api-access-cvfnx\") pod \"auto-csr-approver-29564080-g6g8x\" (UID: \"f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a\") " pod="openshift-infra/auto-csr-approver-29564080-g6g8x" Mar 18 14:40:00 crc kubenswrapper[4756]: I0318 14:40:00.490768 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564080-g6g8x" Mar 18 14:40:01 crc kubenswrapper[4756]: I0318 14:40:01.029960 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564080-g6g8x"] Mar 18 14:40:01 crc kubenswrapper[4756]: W0318 14:40:01.042230 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5937f0b_1f51_4d2a_80ed_bbcb89dc4a5a.slice/crio-ea808353edd05515bdac1932550c5f20e6890741def7ffc69df47feca5078e52 WatchSource:0}: Error finding container ea808353edd05515bdac1932550c5f20e6890741def7ffc69df47feca5078e52: Status 404 returned error can't find the container with id ea808353edd05515bdac1932550c5f20e6890741def7ffc69df47feca5078e52 Mar 18 14:40:01 crc kubenswrapper[4756]: I0318 14:40:01.919785 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564080-g6g8x" event={"ID":"f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a","Type":"ContainerStarted","Data":"ea808353edd05515bdac1932550c5f20e6890741def7ffc69df47feca5078e52"} Mar 18 14:40:02 crc kubenswrapper[4756]: I0318 14:40:02.930773 
4756 generic.go:334] "Generic (PLEG): container finished" podID="f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a" containerID="524378a1c140443e169efea7ab05b80aed2df96b7a5731023c4773af7ad641f3" exitCode=0 Mar 18 14:40:02 crc kubenswrapper[4756]: I0318 14:40:02.930823 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564080-g6g8x" event={"ID":"f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a","Type":"ContainerDied","Data":"524378a1c140443e169efea7ab05b80aed2df96b7a5731023c4773af7ad641f3"} Mar 18 14:40:04 crc kubenswrapper[4756]: I0318 14:40:04.334482 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564080-g6g8x" Mar 18 14:40:04 crc kubenswrapper[4756]: I0318 14:40:04.514644 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvfnx\" (UniqueName: \"kubernetes.io/projected/f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a-kube-api-access-cvfnx\") pod \"f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a\" (UID: \"f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a\") " Mar 18 14:40:04 crc kubenswrapper[4756]: I0318 14:40:04.521898 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a-kube-api-access-cvfnx" (OuterVolumeSpecName: "kube-api-access-cvfnx") pod "f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a" (UID: "f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a"). InnerVolumeSpecName "kube-api-access-cvfnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:40:04 crc kubenswrapper[4756]: I0318 14:40:04.618871 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvfnx\" (UniqueName: \"kubernetes.io/projected/f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a-kube-api-access-cvfnx\") on node \"crc\" DevicePath \"\"" Mar 18 14:40:04 crc kubenswrapper[4756]: I0318 14:40:04.974978 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564080-g6g8x" event={"ID":"f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a","Type":"ContainerDied","Data":"ea808353edd05515bdac1932550c5f20e6890741def7ffc69df47feca5078e52"} Mar 18 14:40:04 crc kubenswrapper[4756]: I0318 14:40:04.975057 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea808353edd05515bdac1932550c5f20e6890741def7ffc69df47feca5078e52" Mar 18 14:40:04 crc kubenswrapper[4756]: I0318 14:40:04.975094 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564080-g6g8x" Mar 18 14:40:05 crc kubenswrapper[4756]: E0318 14:40:05.207281 4756 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5937f0b_1f51_4d2a_80ed_bbcb89dc4a5a.slice\": RecentStats: unable to find data in memory cache]" Mar 18 14:40:05 crc kubenswrapper[4756]: I0318 14:40:05.410870 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564074-vt6tr"] Mar 18 14:40:05 crc kubenswrapper[4756]: I0318 14:40:05.420005 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564074-vt6tr"] Mar 18 14:40:07 crc kubenswrapper[4756]: I0318 14:40:07.331201 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="363e90c7-2daa-4777-bb42-3d82405bedff" 
path="/var/lib/kubelet/pods/363e90c7-2daa-4777-bb42-3d82405bedff/volumes" Mar 18 14:40:33 crc kubenswrapper[4756]: I0318 14:40:33.383176 4756 scope.go:117] "RemoveContainer" containerID="95b204db1c49842f6030ea424a910543abb9fcb7b6bf51ede615bac1e42ec729" Mar 18 14:40:36 crc kubenswrapper[4756]: I0318 14:40:36.915525 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:40:36 crc kubenswrapper[4756]: I0318 14:40:36.916267 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:41:06 crc kubenswrapper[4756]: I0318 14:41:06.914810 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:41:06 crc kubenswrapper[4756]: I0318 14:41:06.915521 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:41:31 crc kubenswrapper[4756]: I0318 14:41:31.963261 4756 generic.go:334] "Generic (PLEG): container finished" podID="2a04f637-4e7d-4efc-8fc0-ce511f450960" containerID="88a47deb3e11a162647b9f55a481a055d711eef8dbd65fc3b3988d58ae5cb7dd" 
exitCode=0 Mar 18 14:41:31 crc kubenswrapper[4756]: I0318 14:41:31.963335 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" event={"ID":"2a04f637-4e7d-4efc-8fc0-ce511f450960","Type":"ContainerDied","Data":"88a47deb3e11a162647b9f55a481a055d711eef8dbd65fc3b3988d58ae5cb7dd"} Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.448854 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.619270 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-ssh-key-openstack-edpm-ipam\") pod \"2a04f637-4e7d-4efc-8fc0-ce511f450960\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.619593 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-libvirt-combined-ca-bundle\") pod \"2a04f637-4e7d-4efc-8fc0-ce511f450960\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.619650 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-inventory\") pod \"2a04f637-4e7d-4efc-8fc0-ce511f450960\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.619817 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b2bj\" (UniqueName: \"kubernetes.io/projected/2a04f637-4e7d-4efc-8fc0-ce511f450960-kube-api-access-9b2bj\") pod \"2a04f637-4e7d-4efc-8fc0-ce511f450960\" (UID: 
\"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.619841 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-libvirt-secret-0\") pod \"2a04f637-4e7d-4efc-8fc0-ce511f450960\" (UID: \"2a04f637-4e7d-4efc-8fc0-ce511f450960\") " Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.635321 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a04f637-4e7d-4efc-8fc0-ce511f450960-kube-api-access-9b2bj" (OuterVolumeSpecName: "kube-api-access-9b2bj") pod "2a04f637-4e7d-4efc-8fc0-ce511f450960" (UID: "2a04f637-4e7d-4efc-8fc0-ce511f450960"). InnerVolumeSpecName "kube-api-access-9b2bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.635557 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2a04f637-4e7d-4efc-8fc0-ce511f450960" (UID: "2a04f637-4e7d-4efc-8fc0-ce511f450960"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.655136 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "2a04f637-4e7d-4efc-8fc0-ce511f450960" (UID: "2a04f637-4e7d-4efc-8fc0-ce511f450960"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.656433 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-inventory" (OuterVolumeSpecName: "inventory") pod "2a04f637-4e7d-4efc-8fc0-ce511f450960" (UID: "2a04f637-4e7d-4efc-8fc0-ce511f450960"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.674890 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2a04f637-4e7d-4efc-8fc0-ce511f450960" (UID: "2a04f637-4e7d-4efc-8fc0-ce511f450960"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.722995 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.723038 4756 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.723051 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.723064 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b2bj\" (UniqueName: 
\"kubernetes.io/projected/2a04f637-4e7d-4efc-8fc0-ce511f450960-kube-api-access-9b2bj\") on node \"crc\" DevicePath \"\"" Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.723076 4756 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2a04f637-4e7d-4efc-8fc0-ce511f450960-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.989877 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" event={"ID":"2a04f637-4e7d-4efc-8fc0-ce511f450960","Type":"ContainerDied","Data":"d7dc9223db5733eb0779ad6918d046f49b1056a10f4f7b1cc815a07f3c6fc947"} Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.989925 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7dc9223db5733eb0779ad6918d046f49b1056a10f4f7b1cc815a07f3c6fc947" Mar 18 14:41:33 crc kubenswrapper[4756]: I0318 14:41:33.989947 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xft9f" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.128919 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h"] Mar 18 14:41:34 crc kubenswrapper[4756]: E0318 14:41:34.129482 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a" containerName="oc" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.129505 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a" containerName="oc" Mar 18 14:41:34 crc kubenswrapper[4756]: E0318 14:41:34.129556 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a04f637-4e7d-4efc-8fc0-ce511f450960" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.129569 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a04f637-4e7d-4efc-8fc0-ce511f450960" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.129823 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a04f637-4e7d-4efc-8fc0-ce511f450960" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.129845 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a" containerName="oc" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.130804 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.133936 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.134174 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.134390 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.134526 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.134557 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.136414 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.140825 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.141866 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h"] Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.235037 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: 
I0318 14:41:34.235105 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.235155 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.235189 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.235220 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdjzc\" (UniqueName: \"kubernetes.io/projected/ddf30b37-8904-4a9f-8b73-8afe413c778b-kube-api-access-gdjzc\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.235257 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.235359 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.235403 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.235435 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.235513 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" 
(UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.235585 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.337441 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.337884 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.337915 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.337945 4756 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.337974 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdjzc\" (UniqueName: \"kubernetes.io/projected/ddf30b37-8904-4a9f-8b73-8afe413c778b-kube-api-access-gdjzc\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.338007 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.338093 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.338168 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: 
\"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.338220 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.338253 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.338286 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.339888 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.343787 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.344251 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.344696 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.344932 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.345450 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.345674 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.345917 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.346941 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.350313 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.359890 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdjzc\" (UniqueName: 
\"kubernetes.io/projected/ddf30b37-8904-4a9f-8b73-8afe413c778b-kube-api-access-gdjzc\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nvw5h\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:34 crc kubenswrapper[4756]: I0318 14:41:34.449335 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:41:35 crc kubenswrapper[4756]: I0318 14:41:35.058617 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h"] Mar 18 14:41:36 crc kubenswrapper[4756]: I0318 14:41:36.028556 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" event={"ID":"ddf30b37-8904-4a9f-8b73-8afe413c778b","Type":"ContainerStarted","Data":"176a00fff0986673217d5070e43a7f757d3d298f9d3f5a395a03d91a070f4473"} Mar 18 14:41:36 crc kubenswrapper[4756]: I0318 14:41:36.059527 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" podStartSLOduration=1.359209101 podStartE2EDuration="2.059508358s" podCreationTimestamp="2026-03-18 14:41:34 +0000 UTC" firstStartedPulling="2026-03-18 14:41:35.071608068 +0000 UTC m=+2496.386026043" lastFinishedPulling="2026-03-18 14:41:35.771907325 +0000 UTC m=+2497.086325300" observedRunningTime="2026-03-18 14:41:36.050167866 +0000 UTC m=+2497.364585861" watchObservedRunningTime="2026-03-18 14:41:36.059508358 +0000 UTC m=+2497.373926333" Mar 18 14:41:36 crc kubenswrapper[4756]: I0318 14:41:36.915934 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:41:36 crc 
kubenswrapper[4756]: I0318 14:41:36.916261 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:41:36 crc kubenswrapper[4756]: I0318 14:41:36.916319 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:41:36 crc kubenswrapper[4756]: I0318 14:41:36.917279 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b"} pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:41:36 crc kubenswrapper[4756]: I0318 14:41:36.917380 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" containerID="cri-o://e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" gracePeriod=600 Mar 18 14:41:37 crc kubenswrapper[4756]: E0318 14:41:37.042502 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:41:37 crc kubenswrapper[4756]: I0318 14:41:37.047238 4756 generic.go:334] "Generic 
(PLEG): container finished" podID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" exitCode=0 Mar 18 14:41:37 crc kubenswrapper[4756]: I0318 14:41:37.047357 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerDied","Data":"e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b"} Mar 18 14:41:37 crc kubenswrapper[4756]: I0318 14:41:37.047491 4756 scope.go:117] "RemoveContainer" containerID="47fe16283e653d97b821ea41888e12ff278529bb4d88752b5670809f30122a6d" Mar 18 14:41:37 crc kubenswrapper[4756]: I0318 14:41:37.052840 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" event={"ID":"ddf30b37-8904-4a9f-8b73-8afe413c778b","Type":"ContainerStarted","Data":"24ef4dd3b69b079cd7c32b4c3cd986c034bdc6b1d428f2e41632f1434827d43e"} Mar 18 14:41:38 crc kubenswrapper[4756]: I0318 14:41:38.073026 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" Mar 18 14:41:38 crc kubenswrapper[4756]: E0318 14:41:38.074032 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:41:53 crc kubenswrapper[4756]: I0318 14:41:53.315832 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" Mar 18 14:41:53 crc kubenswrapper[4756]: E0318 14:41:53.317181 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:42:00 crc kubenswrapper[4756]: I0318 14:42:00.147193 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564082-njvsw"] Mar 18 14:42:00 crc kubenswrapper[4756]: I0318 14:42:00.149406 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564082-njvsw" Mar 18 14:42:00 crc kubenswrapper[4756]: I0318 14:42:00.151942 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:42:00 crc kubenswrapper[4756]: I0318 14:42:00.152525 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:42:00 crc kubenswrapper[4756]: I0318 14:42:00.152837 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:42:00 crc kubenswrapper[4756]: I0318 14:42:00.164411 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564082-njvsw"] Mar 18 14:42:00 crc kubenswrapper[4756]: I0318 14:42:00.248855 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7q6s\" (UniqueName: \"kubernetes.io/projected/140be45f-1eab-4833-b001-7e7aec24337a-kube-api-access-b7q6s\") pod \"auto-csr-approver-29564082-njvsw\" (UID: \"140be45f-1eab-4833-b001-7e7aec24337a\") " pod="openshift-infra/auto-csr-approver-29564082-njvsw" Mar 18 14:42:00 crc kubenswrapper[4756]: I0318 14:42:00.353034 4756 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-b7q6s\" (UniqueName: \"kubernetes.io/projected/140be45f-1eab-4833-b001-7e7aec24337a-kube-api-access-b7q6s\") pod \"auto-csr-approver-29564082-njvsw\" (UID: \"140be45f-1eab-4833-b001-7e7aec24337a\") " pod="openshift-infra/auto-csr-approver-29564082-njvsw" Mar 18 14:42:00 crc kubenswrapper[4756]: I0318 14:42:00.375385 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7q6s\" (UniqueName: \"kubernetes.io/projected/140be45f-1eab-4833-b001-7e7aec24337a-kube-api-access-b7q6s\") pod \"auto-csr-approver-29564082-njvsw\" (UID: \"140be45f-1eab-4833-b001-7e7aec24337a\") " pod="openshift-infra/auto-csr-approver-29564082-njvsw" Mar 18 14:42:00 crc kubenswrapper[4756]: I0318 14:42:00.478504 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564082-njvsw" Mar 18 14:42:00 crc kubenswrapper[4756]: I0318 14:42:00.905680 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564082-njvsw"] Mar 18 14:42:00 crc kubenswrapper[4756]: I0318 14:42:00.908629 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:42:01 crc kubenswrapper[4756]: I0318 14:42:01.344028 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564082-njvsw" event={"ID":"140be45f-1eab-4833-b001-7e7aec24337a","Type":"ContainerStarted","Data":"af960ad68adee1fee5d74953cf200a8fdc21d8bb5251bd0a9a06bec84b74c4ba"} Mar 18 14:42:02 crc kubenswrapper[4756]: I0318 14:42:02.356134 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564082-njvsw" event={"ID":"140be45f-1eab-4833-b001-7e7aec24337a","Type":"ContainerStarted","Data":"585ca17760858d576feb1fc837f5f8baf35f4e9d3ff6fe90858ac7322134412a"} Mar 18 14:42:02 crc kubenswrapper[4756]: I0318 14:42:02.371708 4756 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-infra/auto-csr-approver-29564082-njvsw" podStartSLOduration=1.249491461 podStartE2EDuration="2.371687991s" podCreationTimestamp="2026-03-18 14:42:00 +0000 UTC" firstStartedPulling="2026-03-18 14:42:00.908407762 +0000 UTC m=+2522.222825737" lastFinishedPulling="2026-03-18 14:42:02.030604252 +0000 UTC m=+2523.345022267" observedRunningTime="2026-03-18 14:42:02.370063566 +0000 UTC m=+2523.684481581" watchObservedRunningTime="2026-03-18 14:42:02.371687991 +0000 UTC m=+2523.686105976" Mar 18 14:42:03 crc kubenswrapper[4756]: I0318 14:42:03.368778 4756 generic.go:334] "Generic (PLEG): container finished" podID="140be45f-1eab-4833-b001-7e7aec24337a" containerID="585ca17760858d576feb1fc837f5f8baf35f4e9d3ff6fe90858ac7322134412a" exitCode=0 Mar 18 14:42:03 crc kubenswrapper[4756]: I0318 14:42:03.368904 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564082-njvsw" event={"ID":"140be45f-1eab-4833-b001-7e7aec24337a","Type":"ContainerDied","Data":"585ca17760858d576feb1fc837f5f8baf35f4e9d3ff6fe90858ac7322134412a"} Mar 18 14:42:04 crc kubenswrapper[4756]: I0318 14:42:04.789361 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564082-njvsw" Mar 18 14:42:04 crc kubenswrapper[4756]: I0318 14:42:04.962175 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7q6s\" (UniqueName: \"kubernetes.io/projected/140be45f-1eab-4833-b001-7e7aec24337a-kube-api-access-b7q6s\") pod \"140be45f-1eab-4833-b001-7e7aec24337a\" (UID: \"140be45f-1eab-4833-b001-7e7aec24337a\") " Mar 18 14:42:04 crc kubenswrapper[4756]: I0318 14:42:04.969514 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/140be45f-1eab-4833-b001-7e7aec24337a-kube-api-access-b7q6s" (OuterVolumeSpecName: "kube-api-access-b7q6s") pod "140be45f-1eab-4833-b001-7e7aec24337a" (UID: "140be45f-1eab-4833-b001-7e7aec24337a"). InnerVolumeSpecName "kube-api-access-b7q6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:42:05 crc kubenswrapper[4756]: I0318 14:42:05.065912 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7q6s\" (UniqueName: \"kubernetes.io/projected/140be45f-1eab-4833-b001-7e7aec24337a-kube-api-access-b7q6s\") on node \"crc\" DevicePath \"\"" Mar 18 14:42:05 crc kubenswrapper[4756]: I0318 14:42:05.393456 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564082-njvsw" event={"ID":"140be45f-1eab-4833-b001-7e7aec24337a","Type":"ContainerDied","Data":"af960ad68adee1fee5d74953cf200a8fdc21d8bb5251bd0a9a06bec84b74c4ba"} Mar 18 14:42:05 crc kubenswrapper[4756]: I0318 14:42:05.393520 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af960ad68adee1fee5d74953cf200a8fdc21d8bb5251bd0a9a06bec84b74c4ba" Mar 18 14:42:05 crc kubenswrapper[4756]: I0318 14:42:05.393549 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564082-njvsw" Mar 18 14:42:05 crc kubenswrapper[4756]: I0318 14:42:05.461002 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564076-nnvk9"] Mar 18 14:42:05 crc kubenswrapper[4756]: I0318 14:42:05.472996 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564076-nnvk9"] Mar 18 14:42:07 crc kubenswrapper[4756]: I0318 14:42:07.329348 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f579b3d-c832-4988-895b-5545dc5c857d" path="/var/lib/kubelet/pods/5f579b3d-c832-4988-895b-5545dc5c857d/volumes" Mar 18 14:42:08 crc kubenswrapper[4756]: I0318 14:42:08.316066 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" Mar 18 14:42:08 crc kubenswrapper[4756]: E0318 14:42:08.316467 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:42:19 crc kubenswrapper[4756]: I0318 14:42:19.325950 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" Mar 18 14:42:19 crc kubenswrapper[4756]: E0318 14:42:19.326833 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" 
podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:42:30 crc kubenswrapper[4756]: I0318 14:42:30.316763 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" Mar 18 14:42:30 crc kubenswrapper[4756]: E0318 14:42:30.317962 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:42:33 crc kubenswrapper[4756]: I0318 14:42:33.504866 4756 scope.go:117] "RemoveContainer" containerID="610edfeeee407244e009b8349bd088df4b0075a56274fa3bec6087e147e8e882" Mar 18 14:42:43 crc kubenswrapper[4756]: I0318 14:42:43.316759 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" Mar 18 14:42:43 crc kubenswrapper[4756]: E0318 14:42:43.317891 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:42:55 crc kubenswrapper[4756]: I0318 14:42:55.315825 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" Mar 18 14:42:55 crc kubenswrapper[4756]: E0318 14:42:55.316646 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:43:09 crc kubenswrapper[4756]: I0318 14:43:09.322572 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" Mar 18 14:43:09 crc kubenswrapper[4756]: E0318 14:43:09.323534 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:43:20 crc kubenswrapper[4756]: I0318 14:43:20.316099 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" Mar 18 14:43:20 crc kubenswrapper[4756]: E0318 14:43:20.316903 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:43:34 crc kubenswrapper[4756]: I0318 14:43:34.315707 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" Mar 18 14:43:34 crc kubenswrapper[4756]: E0318 14:43:34.316704 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:43:49 crc kubenswrapper[4756]: I0318 14:43:49.321854 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" Mar 18 14:43:49 crc kubenswrapper[4756]: E0318 14:43:49.322666 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:43:55 crc kubenswrapper[4756]: I0318 14:43:55.502626 4756 generic.go:334] "Generic (PLEG): container finished" podID="ddf30b37-8904-4a9f-8b73-8afe413c778b" containerID="24ef4dd3b69b079cd7c32b4c3cd986c034bdc6b1d428f2e41632f1434827d43e" exitCode=0 Mar 18 14:43:55 crc kubenswrapper[4756]: I0318 14:43:55.502732 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" event={"ID":"ddf30b37-8904-4a9f-8b73-8afe413c778b","Type":"ContainerDied","Data":"24ef4dd3b69b079cd7c32b4c3cd986c034bdc6b1d428f2e41632f1434827d43e"} Mar 18 14:43:56 crc kubenswrapper[4756]: I0318 14:43:56.972196 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.111204 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-2\") pod \"ddf30b37-8904-4a9f-8b73-8afe413c778b\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.111262 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-extra-config-0\") pod \"ddf30b37-8904-4a9f-8b73-8afe413c778b\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.111313 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-migration-ssh-key-1\") pod \"ddf30b37-8904-4a9f-8b73-8afe413c778b\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.111360 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-0\") pod \"ddf30b37-8904-4a9f-8b73-8afe413c778b\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.111402 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-combined-ca-bundle\") pod \"ddf30b37-8904-4a9f-8b73-8afe413c778b\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 
14:43:57.111428 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-migration-ssh-key-0\") pod \"ddf30b37-8904-4a9f-8b73-8afe413c778b\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.111468 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-1\") pod \"ddf30b37-8904-4a9f-8b73-8afe413c778b\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.111508 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-ssh-key-openstack-edpm-ipam\") pod \"ddf30b37-8904-4a9f-8b73-8afe413c778b\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.111532 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-inventory\") pod \"ddf30b37-8904-4a9f-8b73-8afe413c778b\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.111557 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdjzc\" (UniqueName: \"kubernetes.io/projected/ddf30b37-8904-4a9f-8b73-8afe413c778b-kube-api-access-gdjzc\") pod \"ddf30b37-8904-4a9f-8b73-8afe413c778b\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.111674 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-3\") pod \"ddf30b37-8904-4a9f-8b73-8afe413c778b\" (UID: \"ddf30b37-8904-4a9f-8b73-8afe413c778b\") " Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.119558 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddf30b37-8904-4a9f-8b73-8afe413c778b-kube-api-access-gdjzc" (OuterVolumeSpecName: "kube-api-access-gdjzc") pod "ddf30b37-8904-4a9f-8b73-8afe413c778b" (UID: "ddf30b37-8904-4a9f-8b73-8afe413c778b"). InnerVolumeSpecName "kube-api-access-gdjzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.120399 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "ddf30b37-8904-4a9f-8b73-8afe413c778b" (UID: "ddf30b37-8904-4a9f-8b73-8afe413c778b"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.147755 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "ddf30b37-8904-4a9f-8b73-8afe413c778b" (UID: "ddf30b37-8904-4a9f-8b73-8afe413c778b"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.148530 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "ddf30b37-8904-4a9f-8b73-8afe413c778b" (UID: "ddf30b37-8904-4a9f-8b73-8afe413c778b"). 
InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.148999 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "ddf30b37-8904-4a9f-8b73-8afe413c778b" (UID: "ddf30b37-8904-4a9f-8b73-8afe413c778b"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.150605 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "ddf30b37-8904-4a9f-8b73-8afe413c778b" (UID: "ddf30b37-8904-4a9f-8b73-8afe413c778b"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.151411 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-inventory" (OuterVolumeSpecName: "inventory") pod "ddf30b37-8904-4a9f-8b73-8afe413c778b" (UID: "ddf30b37-8904-4a9f-8b73-8afe413c778b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.151592 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "ddf30b37-8904-4a9f-8b73-8afe413c778b" (UID: "ddf30b37-8904-4a9f-8b73-8afe413c778b"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.152690 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "ddf30b37-8904-4a9f-8b73-8afe413c778b" (UID: "ddf30b37-8904-4a9f-8b73-8afe413c778b"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.156956 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ddf30b37-8904-4a9f-8b73-8afe413c778b" (UID: "ddf30b37-8904-4a9f-8b73-8afe413c778b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.173473 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "ddf30b37-8904-4a9f-8b73-8afe413c778b" (UID: "ddf30b37-8904-4a9f-8b73-8afe413c778b"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.215428 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.215480 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.215494 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdjzc\" (UniqueName: \"kubernetes.io/projected/ddf30b37-8904-4a9f-8b73-8afe413c778b-kube-api-access-gdjzc\") on node \"crc\" DevicePath \"\"" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.215508 4756 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.215521 4756 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.215536 4756 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.215548 4756 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-migration-ssh-key-1\") on 
node \"crc\" DevicePath \"\"" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.215560 4756 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.215574 4756 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.215586 4756 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.215597 4756 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ddf30b37-8904-4a9f-8b73-8afe413c778b-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.524762 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" event={"ID":"ddf30b37-8904-4a9f-8b73-8afe413c778b","Type":"ContainerDied","Data":"176a00fff0986673217d5070e43a7f757d3d298f9d3f5a395a03d91a070f4473"} Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.525044 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="176a00fff0986673217d5070e43a7f757d3d298f9d3f5a395a03d91a070f4473" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.524812 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nvw5h" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.674186 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c"] Mar 18 14:43:57 crc kubenswrapper[4756]: E0318 14:43:57.674650 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="140be45f-1eab-4833-b001-7e7aec24337a" containerName="oc" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.674666 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="140be45f-1eab-4833-b001-7e7aec24337a" containerName="oc" Mar 18 14:43:57 crc kubenswrapper[4756]: E0318 14:43:57.674687 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf30b37-8904-4a9f-8b73-8afe413c778b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.674693 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf30b37-8904-4a9f-8b73-8afe413c778b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.674908 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddf30b37-8904-4a9f-8b73-8afe413c778b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.674925 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="140be45f-1eab-4833-b001-7e7aec24337a" containerName="oc" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.675657 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.684600 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.685479 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.684881 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.684934 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h2zf6" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.685785 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.686381 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c"] Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.833481 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.833589 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: 
\"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.833610 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.833630 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.833769 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.833937 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lxt8\" (UniqueName: \"kubernetes.io/projected/c5555558-89a8-4faa-aeb3-0ee1110796be-kube-api-access-5lxt8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.834136 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.935861 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.935962 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.935990 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.936017 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.936048 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.936112 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lxt8\" (UniqueName: \"kubernetes.io/projected/c5555558-89a8-4faa-aeb3-0ee1110796be-kube-api-access-5lxt8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.936225 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.941020 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.941097 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.941477 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.942482 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.943183 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: 
\"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.943842 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:57 crc kubenswrapper[4756]: I0318 14:43:57.967790 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lxt8\" (UniqueName: \"kubernetes.io/projected/c5555558-89a8-4faa-aeb3-0ee1110796be-kube-api-access-5lxt8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:58 crc kubenswrapper[4756]: I0318 14:43:58.001956 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:43:58 crc kubenswrapper[4756]: I0318 14:43:58.553689 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c"] Mar 18 14:43:59 crc kubenswrapper[4756]: I0318 14:43:59.544198 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" event={"ID":"c5555558-89a8-4faa-aeb3-0ee1110796be","Type":"ContainerStarted","Data":"8f397c494b94cdf06403213a34d2346d6d4e5ff955a740c07bb5f36367dac8bf"} Mar 18 14:43:59 crc kubenswrapper[4756]: I0318 14:43:59.544736 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" event={"ID":"c5555558-89a8-4faa-aeb3-0ee1110796be","Type":"ContainerStarted","Data":"c3fb57ac4f51976c83b5d88d9c7935a6d00440b1028f4840af1574694af30eb8"} Mar 18 14:43:59 crc kubenswrapper[4756]: I0318 14:43:59.573341 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" podStartSLOduration=1.978264041 podStartE2EDuration="2.573324394s" podCreationTimestamp="2026-03-18 14:43:57 +0000 UTC" firstStartedPulling="2026-03-18 14:43:58.561478306 +0000 UTC m=+2639.875896281" lastFinishedPulling="2026-03-18 14:43:59.156538659 +0000 UTC m=+2640.470956634" observedRunningTime="2026-03-18 14:43:59.564209216 +0000 UTC m=+2640.878627191" watchObservedRunningTime="2026-03-18 14:43:59.573324394 +0000 UTC m=+2640.887742369" Mar 18 14:44:00 crc kubenswrapper[4756]: I0318 14:44:00.150822 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564084-8vqbb"] Mar 18 14:44:00 crc kubenswrapper[4756]: I0318 14:44:00.152860 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564084-8vqbb" Mar 18 14:44:00 crc kubenswrapper[4756]: I0318 14:44:00.155456 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:44:00 crc kubenswrapper[4756]: I0318 14:44:00.155991 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:44:00 crc kubenswrapper[4756]: I0318 14:44:00.156167 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:44:00 crc kubenswrapper[4756]: I0318 14:44:00.159967 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564084-8vqbb"] Mar 18 14:44:00 crc kubenswrapper[4756]: I0318 14:44:00.296274 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvj7w\" (UniqueName: \"kubernetes.io/projected/3690f204-953e-4a6d-8aa3-4f7f0fa5ff63-kube-api-access-nvj7w\") pod \"auto-csr-approver-29564084-8vqbb\" (UID: \"3690f204-953e-4a6d-8aa3-4f7f0fa5ff63\") " pod="openshift-infra/auto-csr-approver-29564084-8vqbb" Mar 18 14:44:00 crc kubenswrapper[4756]: I0318 14:44:00.316789 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" Mar 18 14:44:00 crc kubenswrapper[4756]: E0318 14:44:00.317039 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:44:00 crc kubenswrapper[4756]: I0318 14:44:00.398968 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nvj7w\" (UniqueName: \"kubernetes.io/projected/3690f204-953e-4a6d-8aa3-4f7f0fa5ff63-kube-api-access-nvj7w\") pod \"auto-csr-approver-29564084-8vqbb\" (UID: \"3690f204-953e-4a6d-8aa3-4f7f0fa5ff63\") " pod="openshift-infra/auto-csr-approver-29564084-8vqbb"
Mar 18 14:44:00 crc kubenswrapper[4756]: I0318 14:44:00.425186 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvj7w\" (UniqueName: \"kubernetes.io/projected/3690f204-953e-4a6d-8aa3-4f7f0fa5ff63-kube-api-access-nvj7w\") pod \"auto-csr-approver-29564084-8vqbb\" (UID: \"3690f204-953e-4a6d-8aa3-4f7f0fa5ff63\") " pod="openshift-infra/auto-csr-approver-29564084-8vqbb"
Mar 18 14:44:00 crc kubenswrapper[4756]: I0318 14:44:00.481344 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564084-8vqbb"
Mar 18 14:44:00 crc kubenswrapper[4756]: I0318 14:44:00.954857 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564084-8vqbb"]
Mar 18 14:44:01 crc kubenswrapper[4756]: I0318 14:44:01.566958 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564084-8vqbb" event={"ID":"3690f204-953e-4a6d-8aa3-4f7f0fa5ff63","Type":"ContainerStarted","Data":"5efc42533629cc5969593f4e45ba15396257c300b53dcde7455e6a5fad1d4afe"}
Mar 18 14:44:02 crc kubenswrapper[4756]: I0318 14:44:02.576248 4756 generic.go:334] "Generic (PLEG): container finished" podID="3690f204-953e-4a6d-8aa3-4f7f0fa5ff63" containerID="a6130cb9309217eba359a4e6de2fcfef5ce0201c410f97e19ca55f4e0b33f8e4" exitCode=0
Mar 18 14:44:02 crc kubenswrapper[4756]: I0318 14:44:02.576337 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564084-8vqbb" event={"ID":"3690f204-953e-4a6d-8aa3-4f7f0fa5ff63","Type":"ContainerDied","Data":"a6130cb9309217eba359a4e6de2fcfef5ce0201c410f97e19ca55f4e0b33f8e4"}
Mar 18 14:44:03 crc kubenswrapper[4756]: I0318 14:44:03.980384 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564084-8vqbb"
Mar 18 14:44:03 crc kubenswrapper[4756]: I0318 14:44:03.984589 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvj7w\" (UniqueName: \"kubernetes.io/projected/3690f204-953e-4a6d-8aa3-4f7f0fa5ff63-kube-api-access-nvj7w\") pod \"3690f204-953e-4a6d-8aa3-4f7f0fa5ff63\" (UID: \"3690f204-953e-4a6d-8aa3-4f7f0fa5ff63\") "
Mar 18 14:44:03 crc kubenswrapper[4756]: I0318 14:44:03.991871 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3690f204-953e-4a6d-8aa3-4f7f0fa5ff63-kube-api-access-nvj7w" (OuterVolumeSpecName: "kube-api-access-nvj7w") pod "3690f204-953e-4a6d-8aa3-4f7f0fa5ff63" (UID: "3690f204-953e-4a6d-8aa3-4f7f0fa5ff63"). InnerVolumeSpecName "kube-api-access-nvj7w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:44:04 crc kubenswrapper[4756]: I0318 14:44:04.087286 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvj7w\" (UniqueName: \"kubernetes.io/projected/3690f204-953e-4a6d-8aa3-4f7f0fa5ff63-kube-api-access-nvj7w\") on node \"crc\" DevicePath \"\""
Mar 18 14:44:04 crc kubenswrapper[4756]: I0318 14:44:04.599193 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564084-8vqbb" event={"ID":"3690f204-953e-4a6d-8aa3-4f7f0fa5ff63","Type":"ContainerDied","Data":"5efc42533629cc5969593f4e45ba15396257c300b53dcde7455e6a5fad1d4afe"}
Mar 18 14:44:04 crc kubenswrapper[4756]: I0318 14:44:04.599233 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5efc42533629cc5969593f4e45ba15396257c300b53dcde7455e6a5fad1d4afe"
Mar 18 14:44:04 crc kubenswrapper[4756]: I0318 14:44:04.599248 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564084-8vqbb"
Mar 18 14:44:05 crc kubenswrapper[4756]: I0318 14:44:05.076249 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564078-hh6wl"]
Mar 18 14:44:05 crc kubenswrapper[4756]: I0318 14:44:05.093524 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564078-hh6wl"]
Mar 18 14:44:05 crc kubenswrapper[4756]: I0318 14:44:05.330207 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daa2530d-df7e-4e7d-ae6e-880282674a28" path="/var/lib/kubelet/pods/daa2530d-df7e-4e7d-ae6e-880282674a28/volumes"
Mar 18 14:44:14 crc kubenswrapper[4756]: I0318 14:44:14.315528 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b"
Mar 18 14:44:14 crc kubenswrapper[4756]: E0318 14:44:14.316237 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622"
Mar 18 14:44:29 crc kubenswrapper[4756]: I0318 14:44:29.315840 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b"
Mar 18 14:44:29 crc kubenswrapper[4756]: E0318 14:44:29.316965 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622"
Mar 18 14:44:33 crc kubenswrapper[4756]: I0318 14:44:33.607523 4756 scope.go:117] "RemoveContainer" containerID="96efdf452898e5c0f6e3407dbd835ebd1df397036229e13aebe150374721e5d9"
Mar 18 14:44:44 crc kubenswrapper[4756]: I0318 14:44:44.315881 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b"
Mar 18 14:44:44 crc kubenswrapper[4756]: E0318 14:44:44.316702 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622"
Mar 18 14:44:59 crc kubenswrapper[4756]: I0318 14:44:59.324078 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b"
Mar 18 14:44:59 crc kubenswrapper[4756]: E0318 14:44:59.324946 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.176917 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"]
Mar 18 14:45:00 crc kubenswrapper[4756]: E0318 14:45:00.177399 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3690f204-953e-4a6d-8aa3-4f7f0fa5ff63" containerName="oc"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.177420 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3690f204-953e-4a6d-8aa3-4f7f0fa5ff63" containerName="oc"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.177636 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3690f204-953e-4a6d-8aa3-4f7f0fa5ff63" containerName="oc"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.178411 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.182978 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.183524 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.188919 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"]
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.283937 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eb2ac90-0345-4649-9c81-c768097546ae-secret-volume\") pod \"collect-profiles-29564085-fggg5\" (UID: \"3eb2ac90-0345-4649-9c81-c768097546ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.284470 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb6f7\" (UniqueName: \"kubernetes.io/projected/3eb2ac90-0345-4649-9c81-c768097546ae-kube-api-access-rb6f7\") pod \"collect-profiles-29564085-fggg5\" (UID: \"3eb2ac90-0345-4649-9c81-c768097546ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.284578 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eb2ac90-0345-4649-9c81-c768097546ae-config-volume\") pod \"collect-profiles-29564085-fggg5\" (UID: \"3eb2ac90-0345-4649-9c81-c768097546ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.388092 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eb2ac90-0345-4649-9c81-c768097546ae-secret-volume\") pod \"collect-profiles-29564085-fggg5\" (UID: \"3eb2ac90-0345-4649-9c81-c768097546ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.388230 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb6f7\" (UniqueName: \"kubernetes.io/projected/3eb2ac90-0345-4649-9c81-c768097546ae-kube-api-access-rb6f7\") pod \"collect-profiles-29564085-fggg5\" (UID: \"3eb2ac90-0345-4649-9c81-c768097546ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.388379 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eb2ac90-0345-4649-9c81-c768097546ae-config-volume\") pod \"collect-profiles-29564085-fggg5\" (UID: \"3eb2ac90-0345-4649-9c81-c768097546ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.389492 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eb2ac90-0345-4649-9c81-c768097546ae-config-volume\") pod \"collect-profiles-29564085-fggg5\" (UID: \"3eb2ac90-0345-4649-9c81-c768097546ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.402155 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eb2ac90-0345-4649-9c81-c768097546ae-secret-volume\") pod \"collect-profiles-29564085-fggg5\" (UID: \"3eb2ac90-0345-4649-9c81-c768097546ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.406161 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb6f7\" (UniqueName: \"kubernetes.io/projected/3eb2ac90-0345-4649-9c81-c768097546ae-kube-api-access-rb6f7\") pod \"collect-profiles-29564085-fggg5\" (UID: \"3eb2ac90-0345-4649-9c81-c768097546ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.499918 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"
Mar 18 14:45:00 crc kubenswrapper[4756]: I0318 14:45:00.980375 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"]
Mar 18 14:45:01 crc kubenswrapper[4756]: I0318 14:45:01.301090 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5" event={"ID":"3eb2ac90-0345-4649-9c81-c768097546ae","Type":"ContainerStarted","Data":"713e89fa3a75cce186f99119803dc6d06c9f594743c38701388d01c59d032f62"}
Mar 18 14:45:01 crc kubenswrapper[4756]: I0318 14:45:01.301545 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5" event={"ID":"3eb2ac90-0345-4649-9c81-c768097546ae","Type":"ContainerStarted","Data":"b41a9b6e7a0b44a879a933c2b4c7d66c373af9ad1b31921941bdb261871a0d8d"}
Mar 18 14:45:01 crc kubenswrapper[4756]: I0318 14:45:01.336970 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5" podStartSLOduration=1.336938243 podStartE2EDuration="1.336938243s" podCreationTimestamp="2026-03-18 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:45:01.328051543 +0000 UTC m=+2702.642469508" watchObservedRunningTime="2026-03-18 14:45:01.336938243 +0000 UTC m=+2702.651356258"
Mar 18 14:45:02 crc kubenswrapper[4756]: I0318 14:45:02.317746 4756 generic.go:334] "Generic (PLEG): container finished" podID="3eb2ac90-0345-4649-9c81-c768097546ae" containerID="713e89fa3a75cce186f99119803dc6d06c9f594743c38701388d01c59d032f62" exitCode=0
Mar 18 14:45:02 crc kubenswrapper[4756]: I0318 14:45:02.318312 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5" event={"ID":"3eb2ac90-0345-4649-9c81-c768097546ae","Type":"ContainerDied","Data":"713e89fa3a75cce186f99119803dc6d06c9f594743c38701388d01c59d032f62"}
Mar 18 14:45:03 crc kubenswrapper[4756]: I0318 14:45:03.737955 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"
Mar 18 14:45:03 crc kubenswrapper[4756]: I0318 14:45:03.883791 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eb2ac90-0345-4649-9c81-c768097546ae-secret-volume\") pod \"3eb2ac90-0345-4649-9c81-c768097546ae\" (UID: \"3eb2ac90-0345-4649-9c81-c768097546ae\") "
Mar 18 14:45:03 crc kubenswrapper[4756]: I0318 14:45:03.884207 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb6f7\" (UniqueName: \"kubernetes.io/projected/3eb2ac90-0345-4649-9c81-c768097546ae-kube-api-access-rb6f7\") pod \"3eb2ac90-0345-4649-9c81-c768097546ae\" (UID: \"3eb2ac90-0345-4649-9c81-c768097546ae\") "
Mar 18 14:45:03 crc kubenswrapper[4756]: I0318 14:45:03.884335 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eb2ac90-0345-4649-9c81-c768097546ae-config-volume\") pod \"3eb2ac90-0345-4649-9c81-c768097546ae\" (UID: \"3eb2ac90-0345-4649-9c81-c768097546ae\") "
Mar 18 14:45:03 crc kubenswrapper[4756]: I0318 14:45:03.884890 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eb2ac90-0345-4649-9c81-c768097546ae-config-volume" (OuterVolumeSpecName: "config-volume") pod "3eb2ac90-0345-4649-9c81-c768097546ae" (UID: "3eb2ac90-0345-4649-9c81-c768097546ae"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 14:45:03 crc kubenswrapper[4756]: I0318 14:45:03.885142 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eb2ac90-0345-4649-9c81-c768097546ae-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 14:45:03 crc kubenswrapper[4756]: I0318 14:45:03.890692 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb2ac90-0345-4649-9c81-c768097546ae-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3eb2ac90-0345-4649-9c81-c768097546ae" (UID: "3eb2ac90-0345-4649-9c81-c768097546ae"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:45:03 crc kubenswrapper[4756]: I0318 14:45:03.890723 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb2ac90-0345-4649-9c81-c768097546ae-kube-api-access-rb6f7" (OuterVolumeSpecName: "kube-api-access-rb6f7") pod "3eb2ac90-0345-4649-9c81-c768097546ae" (UID: "3eb2ac90-0345-4649-9c81-c768097546ae"). InnerVolumeSpecName "kube-api-access-rb6f7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:45:03 crc kubenswrapper[4756]: I0318 14:45:03.987213 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eb2ac90-0345-4649-9c81-c768097546ae-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 14:45:03 crc kubenswrapper[4756]: I0318 14:45:03.987248 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb6f7\" (UniqueName: \"kubernetes.io/projected/3eb2ac90-0345-4649-9c81-c768097546ae-kube-api-access-rb6f7\") on node \"crc\" DevicePath \"\""
Mar 18 14:45:04 crc kubenswrapper[4756]: I0318 14:45:04.341715 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5" event={"ID":"3eb2ac90-0345-4649-9c81-c768097546ae","Type":"ContainerDied","Data":"b41a9b6e7a0b44a879a933c2b4c7d66c373af9ad1b31921941bdb261871a0d8d"}
Mar 18 14:45:04 crc kubenswrapper[4756]: I0318 14:45:04.341790 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b41a9b6e7a0b44a879a933c2b4c7d66c373af9ad1b31921941bdb261871a0d8d"
Mar 18 14:45:04 crc kubenswrapper[4756]: I0318 14:45:04.341832 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fggg5"
Mar 18 14:45:04 crc kubenswrapper[4756]: I0318 14:45:04.418983 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw"]
Mar 18 14:45:04 crc kubenswrapper[4756]: I0318 14:45:04.430383 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564040-5nxvw"]
Mar 18 14:45:05 crc kubenswrapper[4756]: I0318 14:45:05.338458 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31db1d26-2d05-489e-9177-580797f8897c" path="/var/lib/kubelet/pods/31db1d26-2d05-489e-9177-580797f8897c/volumes"
Mar 18 14:45:13 crc kubenswrapper[4756]: I0318 14:45:13.316578 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b"
Mar 18 14:45:13 crc kubenswrapper[4756]: E0318 14:45:13.317996 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622"
Mar 18 14:45:25 crc kubenswrapper[4756]: I0318 14:45:25.316873 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b"
Mar 18 14:45:25 crc kubenswrapper[4756]: E0318 14:45:25.318021 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622"
Mar 18 14:45:33 crc kubenswrapper[4756]: I0318 14:45:33.691180 4756 scope.go:117] "RemoveContainer" containerID="3bddb0496b08fed63d21de430db606e49d46cfdd6c4819ab0775c8c6e5033594"
Mar 18 14:45:38 crc kubenswrapper[4756]: I0318 14:45:38.316554 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b"
Mar 18 14:45:38 crc kubenswrapper[4756]: E0318 14:45:38.317777 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622"
Mar 18 14:45:52 crc kubenswrapper[4756]: I0318 14:45:52.315591 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b"
Mar 18 14:45:52 crc kubenswrapper[4756]: E0318 14:45:52.316480 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622"
Mar 18 14:46:00 crc kubenswrapper[4756]: I0318 14:46:00.164835 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564086-mkgvh"]
Mar 18 14:46:00 crc kubenswrapper[4756]: E0318 14:46:00.165956 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb2ac90-0345-4649-9c81-c768097546ae" containerName="collect-profiles"
Mar 18 14:46:00 crc kubenswrapper[4756]: I0318 14:46:00.165974 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb2ac90-0345-4649-9c81-c768097546ae" containerName="collect-profiles"
Mar 18 14:46:00 crc kubenswrapper[4756]: I0318 14:46:00.166225 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb2ac90-0345-4649-9c81-c768097546ae" containerName="collect-profiles"
Mar 18 14:46:00 crc kubenswrapper[4756]: I0318 14:46:00.167152 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564086-mkgvh"
Mar 18 14:46:00 crc kubenswrapper[4756]: I0318 14:46:00.169424 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 14:46:00 crc kubenswrapper[4756]: I0318 14:46:00.171175 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx"
Mar 18 14:46:00 crc kubenswrapper[4756]: I0318 14:46:00.171237 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 14:46:00 crc kubenswrapper[4756]: I0318 14:46:00.184737 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564086-mkgvh"]
Mar 18 14:46:00 crc kubenswrapper[4756]: I0318 14:46:00.260090 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2rqg\" (UniqueName: \"kubernetes.io/projected/4cab5bbb-cbae-4857-b0c3-a19146b51fe0-kube-api-access-q2rqg\") pod \"auto-csr-approver-29564086-mkgvh\" (UID: \"4cab5bbb-cbae-4857-b0c3-a19146b51fe0\") " pod="openshift-infra/auto-csr-approver-29564086-mkgvh"
Mar 18 14:46:00 crc kubenswrapper[4756]: I0318 14:46:00.362030 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2rqg\" (UniqueName: \"kubernetes.io/projected/4cab5bbb-cbae-4857-b0c3-a19146b51fe0-kube-api-access-q2rqg\") pod \"auto-csr-approver-29564086-mkgvh\" (UID: \"4cab5bbb-cbae-4857-b0c3-a19146b51fe0\") " pod="openshift-infra/auto-csr-approver-29564086-mkgvh"
Mar 18 14:46:00 crc kubenswrapper[4756]: I0318 14:46:00.386264 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2rqg\" (UniqueName: \"kubernetes.io/projected/4cab5bbb-cbae-4857-b0c3-a19146b51fe0-kube-api-access-q2rqg\") pod \"auto-csr-approver-29564086-mkgvh\" (UID: \"4cab5bbb-cbae-4857-b0c3-a19146b51fe0\") " pod="openshift-infra/auto-csr-approver-29564086-mkgvh"
Mar 18 14:46:00 crc kubenswrapper[4756]: I0318 14:46:00.502918 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564086-mkgvh"
Mar 18 14:46:00 crc kubenswrapper[4756]: I0318 14:46:00.984095 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564086-mkgvh"]
Mar 18 14:46:01 crc kubenswrapper[4756]: I0318 14:46:01.024473 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564086-mkgvh" event={"ID":"4cab5bbb-cbae-4857-b0c3-a19146b51fe0","Type":"ContainerStarted","Data":"592299d3b80fd38d0745d1679124c689532ff747d4d5b4cac1ada7c3f15ea2da"}
Mar 18 14:46:03 crc kubenswrapper[4756]: I0318 14:46:03.063334 4756 generic.go:334] "Generic (PLEG): container finished" podID="4cab5bbb-cbae-4857-b0c3-a19146b51fe0" containerID="c6f80fc83b82efb2922aff8339f15c8f2ffc4c069440960cbc42bb19e7270fc7" exitCode=0
Mar 18 14:46:03 crc kubenswrapper[4756]: I0318 14:46:03.063418 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564086-mkgvh" event={"ID":"4cab5bbb-cbae-4857-b0c3-a19146b51fe0","Type":"ContainerDied","Data":"c6f80fc83b82efb2922aff8339f15c8f2ffc4c069440960cbc42bb19e7270fc7"}
Mar 18 14:46:04 crc kubenswrapper[4756]: I0318 14:46:04.316368 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b"
Mar 18 14:46:04 crc kubenswrapper[4756]: E0318 14:46:04.317151 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622"
Mar 18 14:46:04 crc kubenswrapper[4756]: I0318 14:46:04.551258 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564086-mkgvh"
Mar 18 14:46:04 crc kubenswrapper[4756]: I0318 14:46:04.659258 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2rqg\" (UniqueName: \"kubernetes.io/projected/4cab5bbb-cbae-4857-b0c3-a19146b51fe0-kube-api-access-q2rqg\") pod \"4cab5bbb-cbae-4857-b0c3-a19146b51fe0\" (UID: \"4cab5bbb-cbae-4857-b0c3-a19146b51fe0\") "
Mar 18 14:46:04 crc kubenswrapper[4756]: I0318 14:46:04.666351 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cab5bbb-cbae-4857-b0c3-a19146b51fe0-kube-api-access-q2rqg" (OuterVolumeSpecName: "kube-api-access-q2rqg") pod "4cab5bbb-cbae-4857-b0c3-a19146b51fe0" (UID: "4cab5bbb-cbae-4857-b0c3-a19146b51fe0"). InnerVolumeSpecName "kube-api-access-q2rqg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:46:04 crc kubenswrapper[4756]: I0318 14:46:04.761805 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2rqg\" (UniqueName: \"kubernetes.io/projected/4cab5bbb-cbae-4857-b0c3-a19146b51fe0-kube-api-access-q2rqg\") on node \"crc\" DevicePath \"\""
Mar 18 14:46:05 crc kubenswrapper[4756]: I0318 14:46:05.097185 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564086-mkgvh" event={"ID":"4cab5bbb-cbae-4857-b0c3-a19146b51fe0","Type":"ContainerDied","Data":"592299d3b80fd38d0745d1679124c689532ff747d4d5b4cac1ada7c3f15ea2da"}
Mar 18 14:46:05 crc kubenswrapper[4756]: I0318 14:46:05.097602 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="592299d3b80fd38d0745d1679124c689532ff747d4d5b4cac1ada7c3f15ea2da"
Mar 18 14:46:05 crc kubenswrapper[4756]: I0318 14:46:05.097282 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564086-mkgvh"
Mar 18 14:46:05 crc kubenswrapper[4756]: I0318 14:46:05.640840 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564080-g6g8x"]
Mar 18 14:46:05 crc kubenswrapper[4756]: I0318 14:46:05.650236 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564080-g6g8x"]
Mar 18 14:46:07 crc kubenswrapper[4756]: I0318 14:46:07.326073 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a" path="/var/lib/kubelet/pods/f5937f0b-1f51-4d2a-80ed-bbcb89dc4a5a/volumes"
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.086482 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j7rkt"]
Mar 18 14:46:13 crc kubenswrapper[4756]: E0318 14:46:13.087552 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cab5bbb-cbae-4857-b0c3-a19146b51fe0" containerName="oc"
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.087569 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cab5bbb-cbae-4857-b0c3-a19146b51fe0" containerName="oc"
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.087834 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cab5bbb-cbae-4857-b0c3-a19146b51fe0" containerName="oc"
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.089713 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j7rkt"
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.140237 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j7rkt"]
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.158917 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg4ds\" (UniqueName: \"kubernetes.io/projected/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-kube-api-access-qg4ds\") pod \"redhat-operators-j7rkt\" (UID: \"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884\") " pod="openshift-marketplace/redhat-operators-j7rkt"
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.159214 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-catalog-content\") pod \"redhat-operators-j7rkt\" (UID: \"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884\") " pod="openshift-marketplace/redhat-operators-j7rkt"
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.159359 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-utilities\") pod \"redhat-operators-j7rkt\" (UID: \"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884\") " pod="openshift-marketplace/redhat-operators-j7rkt"
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.261828 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-catalog-content\") pod \"redhat-operators-j7rkt\" (UID: \"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884\") " pod="openshift-marketplace/redhat-operators-j7rkt"
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.261905 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-utilities\") pod \"redhat-operators-j7rkt\" (UID: \"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884\") " pod="openshift-marketplace/redhat-operators-j7rkt"
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.262005 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg4ds\" (UniqueName: \"kubernetes.io/projected/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-kube-api-access-qg4ds\") pod \"redhat-operators-j7rkt\" (UID: \"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884\") " pod="openshift-marketplace/redhat-operators-j7rkt"
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.262567 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-catalog-content\") pod \"redhat-operators-j7rkt\" (UID: \"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884\") " pod="openshift-marketplace/redhat-operators-j7rkt"
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.262639 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-utilities\") pod \"redhat-operators-j7rkt\" (UID: \"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884\") " pod="openshift-marketplace/redhat-operators-j7rkt"
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.292965 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg4ds\" (UniqueName: \"kubernetes.io/projected/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-kube-api-access-qg4ds\") pod \"redhat-operators-j7rkt\" (UID: \"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884\") " pod="openshift-marketplace/redhat-operators-j7rkt"
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.418918 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j7rkt"
Mar 18 14:46:13 crc kubenswrapper[4756]: I0318 14:46:13.900440 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j7rkt"]
Mar 18 14:46:14 crc kubenswrapper[4756]: I0318 14:46:14.223170 4756 generic.go:334] "Generic (PLEG): container finished" podID="0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" containerID="d0d51f8b8442556db706d6bc64b53160a9332d57f197db9b64cae98652dbbb86" exitCode=0
Mar 18 14:46:14 crc kubenswrapper[4756]: I0318 14:46:14.223276 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7rkt" event={"ID":"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884","Type":"ContainerDied","Data":"d0d51f8b8442556db706d6bc64b53160a9332d57f197db9b64cae98652dbbb86"}
Mar 18 14:46:14 crc kubenswrapper[4756]: I0318 14:46:14.223445 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7rkt" event={"ID":"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884","Type":"ContainerStarted","Data":"05b4d351eac7cc1db93ff5bbc1112f7f560032f8d43a02efb123ba1153b77045"}
Mar 18 14:46:16 crc kubenswrapper[4756]: I0318 14:46:16.247059 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7rkt" event={"ID":"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884","Type":"ContainerStarted","Data":"5f0a482f9a661a177b997a179426d47b169578ccc1f70a062f41ce9c081ee3c9"}
Mar 18 14:46:17 crc kubenswrapper[4756]: I0318 14:46:17.315431 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b"
Mar 18 14:46:17 crc kubenswrapper[4756]: E0318 14:46:17.316033 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622"
Mar 18 14:46:21 crc kubenswrapper[4756]: I0318 14:46:21.301281 4756 generic.go:334] "Generic (PLEG): container finished" podID="0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" containerID="5f0a482f9a661a177b997a179426d47b169578ccc1f70a062f41ce9c081ee3c9" exitCode=0
Mar 18 14:46:21 crc kubenswrapper[4756]: I0318 14:46:21.301327 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7rkt" event={"ID":"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884","Type":"ContainerDied","Data":"5f0a482f9a661a177b997a179426d47b169578ccc1f70a062f41ce9c081ee3c9"}
Mar 18 14:46:22 crc kubenswrapper[4756]: I0318 14:46:22.314971 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7rkt" event={"ID":"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884","Type":"ContainerStarted","Data":"ea3dbf9ea57d1bf9191f6e4469df781274267857c88cfb5adec4c5957cc602ab"}
Mar 18 14:46:22 crc kubenswrapper[4756]: I0318 14:46:22.343015 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j7rkt" podStartSLOduration=1.825564631 podStartE2EDuration="9.342995832s" podCreationTimestamp="2026-03-18 14:46:13 +0000 UTC" firstStartedPulling="2026-03-18 14:46:14.224850005 +0000 UTC m=+2775.539267980" lastFinishedPulling="2026-03-18 14:46:21.742281206 +0000 UTC m=+2783.056699181" observedRunningTime="2026-03-18 14:46:22.335034257 +0000 UTC m=+2783.649452252" watchObservedRunningTime="2026-03-18 14:46:22.342995832 +0000 UTC m=+2783.657413807"
Mar 18 14:46:23 crc kubenswrapper[4756]: I0318 14:46:23.419558 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j7rkt"
Mar 18 14:46:23 crc kubenswrapper[4756]: I0318 14:46:23.419850 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j7rkt"
Mar 18 14:46:24 crc kubenswrapper[4756]: I0318 14:46:24.464915 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j7rkt" podUID="0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" containerName="registry-server" probeResult="failure" output=<
Mar 18 14:46:24 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s
Mar 18 14:46:24 crc kubenswrapper[4756]: >
Mar 18 14:46:25 crc kubenswrapper[4756]: I0318 14:46:25.344224 4756 generic.go:334] "Generic (PLEG): container finished" podID="c5555558-89a8-4faa-aeb3-0ee1110796be" containerID="8f397c494b94cdf06403213a34d2346d6d4e5ff955a740c07bb5f36367dac8bf" exitCode=0
Mar 18 14:46:25 crc kubenswrapper[4756]: I0318 14:46:25.345334 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" event={"ID":"c5555558-89a8-4faa-aeb3-0ee1110796be","Type":"ContainerDied","Data":"8f397c494b94cdf06403213a34d2346d6d4e5ff955a740c07bb5f36367dac8bf"}
Mar 18 14:46:26 crc kubenswrapper[4756]: I0318 14:46:26.977507 4756 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.164259 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-0\") pod \"c5555558-89a8-4faa-aeb3-0ee1110796be\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.164566 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-inventory\") pod \"c5555558-89a8-4faa-aeb3-0ee1110796be\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.164673 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-2\") pod \"c5555558-89a8-4faa-aeb3-0ee1110796be\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.164788 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ssh-key-openstack-edpm-ipam\") pod \"c5555558-89a8-4faa-aeb3-0ee1110796be\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.164885 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-telemetry-combined-ca-bundle\") pod \"c5555558-89a8-4faa-aeb3-0ee1110796be\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " Mar 18 14:46:27 crc 
kubenswrapper[4756]: I0318 14:46:27.165183 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-1\") pod \"c5555558-89a8-4faa-aeb3-0ee1110796be\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.165288 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lxt8\" (UniqueName: \"kubernetes.io/projected/c5555558-89a8-4faa-aeb3-0ee1110796be-kube-api-access-5lxt8\") pod \"c5555558-89a8-4faa-aeb3-0ee1110796be\" (UID: \"c5555558-89a8-4faa-aeb3-0ee1110796be\") " Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.171708 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5555558-89a8-4faa-aeb3-0ee1110796be-kube-api-access-5lxt8" (OuterVolumeSpecName: "kube-api-access-5lxt8") pod "c5555558-89a8-4faa-aeb3-0ee1110796be" (UID: "c5555558-89a8-4faa-aeb3-0ee1110796be"). InnerVolumeSpecName "kube-api-access-5lxt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.172148 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "c5555558-89a8-4faa-aeb3-0ee1110796be" (UID: "c5555558-89a8-4faa-aeb3-0ee1110796be"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.200483 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "c5555558-89a8-4faa-aeb3-0ee1110796be" (UID: "c5555558-89a8-4faa-aeb3-0ee1110796be"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.202005 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-inventory" (OuterVolumeSpecName: "inventory") pod "c5555558-89a8-4faa-aeb3-0ee1110796be" (UID: "c5555558-89a8-4faa-aeb3-0ee1110796be"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.202286 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "c5555558-89a8-4faa-aeb3-0ee1110796be" (UID: "c5555558-89a8-4faa-aeb3-0ee1110796be"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.202610 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "c5555558-89a8-4faa-aeb3-0ee1110796be" (UID: "c5555558-89a8-4faa-aeb3-0ee1110796be"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.204774 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c5555558-89a8-4faa-aeb3-0ee1110796be" (UID: "c5555558-89a8-4faa-aeb3-0ee1110796be"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.267299 4756 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.267334 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lxt8\" (UniqueName: \"kubernetes.io/projected/c5555558-89a8-4faa-aeb3-0ee1110796be-kube-api-access-5lxt8\") on node \"crc\" DevicePath \"\"" Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.267344 4756 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.267354 4756 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.267363 4756 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 18 14:46:27 crc 
kubenswrapper[4756]: I0318 14:46:27.267372 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.267381 4756 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5555558-89a8-4faa-aeb3-0ee1110796be-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.371474 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" event={"ID":"c5555558-89a8-4faa-aeb3-0ee1110796be","Type":"ContainerDied","Data":"c3fb57ac4f51976c83b5d88d9c7935a6d00440b1028f4840af1574694af30eb8"} Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.371530 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c" Mar 18 14:46:27 crc kubenswrapper[4756]: I0318 14:46:27.371539 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3fb57ac4f51976c83b5d88d9c7935a6d00440b1028f4840af1574694af30eb8" Mar 18 14:46:29 crc kubenswrapper[4756]: I0318 14:46:29.374748 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zph2b"] Mar 18 14:46:29 crc kubenswrapper[4756]: E0318 14:46:29.375592 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5555558-89a8-4faa-aeb3-0ee1110796be" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 18 14:46:29 crc kubenswrapper[4756]: I0318 14:46:29.375611 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5555558-89a8-4faa-aeb3-0ee1110796be" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 18 14:46:29 crc kubenswrapper[4756]: I0318 14:46:29.375892 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5555558-89a8-4faa-aeb3-0ee1110796be" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 18 14:46:29 crc kubenswrapper[4756]: I0318 14:46:29.377599 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zph2b" Mar 18 14:46:29 crc kubenswrapper[4756]: I0318 14:46:29.392419 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zph2b"] Mar 18 14:46:29 crc kubenswrapper[4756]: I0318 14:46:29.414485 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhmx7\" (UniqueName: \"kubernetes.io/projected/6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e-kube-api-access-vhmx7\") pod \"community-operators-zph2b\" (UID: \"6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e\") " pod="openshift-marketplace/community-operators-zph2b" Mar 18 14:46:29 crc kubenswrapper[4756]: I0318 14:46:29.414838 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e-catalog-content\") pod \"community-operators-zph2b\" (UID: \"6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e\") " pod="openshift-marketplace/community-operators-zph2b" Mar 18 14:46:29 crc kubenswrapper[4756]: I0318 14:46:29.415139 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e-utilities\") pod \"community-operators-zph2b\" (UID: \"6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e\") " pod="openshift-marketplace/community-operators-zph2b" Mar 18 14:46:29 crc kubenswrapper[4756]: I0318 14:46:29.517940 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhmx7\" (UniqueName: \"kubernetes.io/projected/6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e-kube-api-access-vhmx7\") pod \"community-operators-zph2b\" (UID: \"6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e\") " pod="openshift-marketplace/community-operators-zph2b" Mar 18 14:46:29 crc kubenswrapper[4756]: I0318 14:46:29.518339 4756 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e-catalog-content\") pod \"community-operators-zph2b\" (UID: \"6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e\") " pod="openshift-marketplace/community-operators-zph2b" Mar 18 14:46:29 crc kubenswrapper[4756]: I0318 14:46:29.518455 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e-utilities\") pod \"community-operators-zph2b\" (UID: \"6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e\") " pod="openshift-marketplace/community-operators-zph2b" Mar 18 14:46:29 crc kubenswrapper[4756]: I0318 14:46:29.518726 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e-catalog-content\") pod \"community-operators-zph2b\" (UID: \"6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e\") " pod="openshift-marketplace/community-operators-zph2b" Mar 18 14:46:29 crc kubenswrapper[4756]: I0318 14:46:29.518969 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e-utilities\") pod \"community-operators-zph2b\" (UID: \"6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e\") " pod="openshift-marketplace/community-operators-zph2b" Mar 18 14:46:29 crc kubenswrapper[4756]: I0318 14:46:29.535641 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhmx7\" (UniqueName: \"kubernetes.io/projected/6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e-kube-api-access-vhmx7\") pod \"community-operators-zph2b\" (UID: \"6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e\") " pod="openshift-marketplace/community-operators-zph2b" Mar 18 14:46:29 crc kubenswrapper[4756]: I0318 14:46:29.700581 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zph2b" Mar 18 14:46:30 crc kubenswrapper[4756]: I0318 14:46:30.202400 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zph2b"] Mar 18 14:46:30 crc kubenswrapper[4756]: I0318 14:46:30.396461 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zph2b" event={"ID":"6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e","Type":"ContainerStarted","Data":"4a9f0117d033baae86ad1e8871feceda3576ab9aab28da8873b5e9a4a2906df3"} Mar 18 14:46:31 crc kubenswrapper[4756]: I0318 14:46:31.417152 4756 generic.go:334] "Generic (PLEG): container finished" podID="6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e" containerID="06d1f93762860fcdf8f4ab8f72fee73b0ab9727a62d7546dd78779e8a0b24198" exitCode=0 Mar 18 14:46:31 crc kubenswrapper[4756]: I0318 14:46:31.417990 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zph2b" event={"ID":"6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e","Type":"ContainerDied","Data":"06d1f93762860fcdf8f4ab8f72fee73b0ab9727a62d7546dd78779e8a0b24198"} Mar 18 14:46:32 crc kubenswrapper[4756]: I0318 14:46:32.316213 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" Mar 18 14:46:32 crc kubenswrapper[4756]: E0318 14:46:32.317040 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:46:33 crc kubenswrapper[4756]: I0318 14:46:33.472815 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-j7rkt" Mar 18 14:46:33 crc kubenswrapper[4756]: I0318 14:46:33.522867 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j7rkt" Mar 18 14:46:33 crc kubenswrapper[4756]: I0318 14:46:33.750099 4756 scope.go:117] "RemoveContainer" containerID="524378a1c140443e169efea7ab05b80aed2df96b7a5731023c4773af7ad641f3" Mar 18 14:46:34 crc kubenswrapper[4756]: I0318 14:46:34.721665 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j7rkt"] Mar 18 14:46:35 crc kubenswrapper[4756]: I0318 14:46:35.470537 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j7rkt" podUID="0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" containerName="registry-server" containerID="cri-o://ea3dbf9ea57d1bf9191f6e4469df781274267857c88cfb5adec4c5957cc602ab" gracePeriod=2 Mar 18 14:46:37 crc kubenswrapper[4756]: I0318 14:46:37.493876 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zph2b" event={"ID":"6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e","Type":"ContainerStarted","Data":"eaf156e75dd4de4ebbedf8b8eb6faf38ededed0f32840e410d9af7ec6acfb8a6"} Mar 18 14:46:37 crc kubenswrapper[4756]: I0318 14:46:37.496199 4756 generic.go:334] "Generic (PLEG): container finished" podID="0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" containerID="ea3dbf9ea57d1bf9191f6e4469df781274267857c88cfb5adec4c5957cc602ab" exitCode=0 Mar 18 14:46:37 crc kubenswrapper[4756]: I0318 14:46:37.496235 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7rkt" event={"ID":"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884","Type":"ContainerDied","Data":"ea3dbf9ea57d1bf9191f6e4469df781274267857c88cfb5adec4c5957cc602ab"} Mar 18 14:46:37 crc kubenswrapper[4756]: I0318 14:46:37.496253 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-j7rkt" event={"ID":"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884","Type":"ContainerDied","Data":"05b4d351eac7cc1db93ff5bbc1112f7f560032f8d43a02efb123ba1153b77045"} Mar 18 14:46:37 crc kubenswrapper[4756]: I0318 14:46:37.496266 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05b4d351eac7cc1db93ff5bbc1112f7f560032f8d43a02efb123ba1153b77045" Mar 18 14:46:37 crc kubenswrapper[4756]: I0318 14:46:37.501964 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j7rkt" Mar 18 14:46:37 crc kubenswrapper[4756]: I0318 14:46:37.598793 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg4ds\" (UniqueName: \"kubernetes.io/projected/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-kube-api-access-qg4ds\") pod \"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884\" (UID: \"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884\") " Mar 18 14:46:37 crc kubenswrapper[4756]: I0318 14:46:37.598986 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-utilities\") pod \"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884\" (UID: \"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884\") " Mar 18 14:46:37 crc kubenswrapper[4756]: I0318 14:46:37.599085 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-catalog-content\") pod \"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884\" (UID: \"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884\") " Mar 18 14:46:37 crc kubenswrapper[4756]: I0318 14:46:37.600071 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-utilities" (OuterVolumeSpecName: "utilities") pod "0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" (UID: 
"0bd25cfa-56aa-4d6a-92b8-f6d4358a3884"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:46:37 crc kubenswrapper[4756]: I0318 14:46:37.606673 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-kube-api-access-qg4ds" (OuterVolumeSpecName: "kube-api-access-qg4ds") pod "0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" (UID: "0bd25cfa-56aa-4d6a-92b8-f6d4358a3884"). InnerVolumeSpecName "kube-api-access-qg4ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:46:37 crc kubenswrapper[4756]: I0318 14:46:37.702588 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:46:37 crc kubenswrapper[4756]: I0318 14:46:37.702621 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg4ds\" (UniqueName: \"kubernetes.io/projected/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-kube-api-access-qg4ds\") on node \"crc\" DevicePath \"\"" Mar 18 14:46:37 crc kubenswrapper[4756]: I0318 14:46:37.733802 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" (UID: "0bd25cfa-56aa-4d6a-92b8-f6d4358a3884"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:46:37 crc kubenswrapper[4756]: I0318 14:46:37.804427 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:46:38 crc kubenswrapper[4756]: I0318 14:46:38.513169 4756 generic.go:334] "Generic (PLEG): container finished" podID="6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e" containerID="eaf156e75dd4de4ebbedf8b8eb6faf38ededed0f32840e410d9af7ec6acfb8a6" exitCode=0 Mar 18 14:46:38 crc kubenswrapper[4756]: I0318 14:46:38.513250 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zph2b" event={"ID":"6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e","Type":"ContainerDied","Data":"eaf156e75dd4de4ebbedf8b8eb6faf38ededed0f32840e410d9af7ec6acfb8a6"} Mar 18 14:46:38 crc kubenswrapper[4756]: I0318 14:46:38.513870 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j7rkt" Mar 18 14:46:38 crc kubenswrapper[4756]: I0318 14:46:38.576563 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j7rkt"] Mar 18 14:46:38 crc kubenswrapper[4756]: I0318 14:46:38.585642 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j7rkt"] Mar 18 14:46:39 crc kubenswrapper[4756]: I0318 14:46:39.333269 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" path="/var/lib/kubelet/pods/0bd25cfa-56aa-4d6a-92b8-f6d4358a3884/volumes" Mar 18 14:46:39 crc kubenswrapper[4756]: I0318 14:46:39.528982 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zph2b" event={"ID":"6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e","Type":"ContainerStarted","Data":"9b9c8a241c0b18c5bfaf95dace2cbca331bb5f7e682a4c945bed4cbe0ffec804"} Mar 18 14:46:39 crc kubenswrapper[4756]: I0318 14:46:39.553697 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zph2b" podStartSLOduration=2.9956838709999998 podStartE2EDuration="10.553674429s" podCreationTimestamp="2026-03-18 14:46:29 +0000 UTC" firstStartedPulling="2026-03-18 14:46:31.420243309 +0000 UTC m=+2792.734661324" lastFinishedPulling="2026-03-18 14:46:38.978233907 +0000 UTC m=+2800.292651882" observedRunningTime="2026-03-18 14:46:39.552999511 +0000 UTC m=+2800.867417626" watchObservedRunningTime="2026-03-18 14:46:39.553674429 +0000 UTC m=+2800.868092404" Mar 18 14:46:39 crc kubenswrapper[4756]: I0318 14:46:39.701359 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zph2b" Mar 18 14:46:39 crc kubenswrapper[4756]: I0318 14:46:39.701420 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-zph2b" Mar 18 14:46:40 crc kubenswrapper[4756]: I0318 14:46:40.748811 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-zph2b" podUID="6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e" containerName="registry-server" probeResult="failure" output=< Mar 18 14:46:40 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Mar 18 14:46:40 crc kubenswrapper[4756]: > Mar 18 14:46:43 crc kubenswrapper[4756]: I0318 14:46:43.315846 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" Mar 18 14:46:43 crc kubenswrapper[4756]: I0318 14:46:43.577351 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"c541bb14ab625c993df67e9d02820948c3b385ef7f0e95f5dad40eb319141136"} Mar 18 14:46:49 crc kubenswrapper[4756]: I0318 14:46:49.814786 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zph2b" Mar 18 14:46:49 crc kubenswrapper[4756]: I0318 14:46:49.906663 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zph2b" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.028440 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zph2b"] Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.106441 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pb94g"] Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.107006 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pb94g" podUID="25847598-c3f0-419a-b422-6b35f9f71311" containerName="registry-server" 
containerID="cri-o://e6730ec25cdadc0b0f00e8b0c212a52914b1f2c2fafa1eaee80a7d915faa080c" gracePeriod=2 Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.659556 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.670351 4756 generic.go:334] "Generic (PLEG): container finished" podID="25847598-c3f0-419a-b422-6b35f9f71311" containerID="e6730ec25cdadc0b0f00e8b0c212a52914b1f2c2fafa1eaee80a7d915faa080c" exitCode=0 Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.670409 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb94g" event={"ID":"25847598-c3f0-419a-b422-6b35f9f71311","Type":"ContainerDied","Data":"e6730ec25cdadc0b0f00e8b0c212a52914b1f2c2fafa1eaee80a7d915faa080c"} Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.670441 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb94g" event={"ID":"25847598-c3f0-419a-b422-6b35f9f71311","Type":"ContainerDied","Data":"acda8be0354531366e2f4e336cec135b6c0eaaf411bf654ba0e83abeb13e7cab"} Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.670461 4756 scope.go:117] "RemoveContainer" containerID="e6730ec25cdadc0b0f00e8b0c212a52914b1f2c2fafa1eaee80a7d915faa080c" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.670412 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pb94g" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.691815 4756 scope.go:117] "RemoveContainer" containerID="bdc7db3f24bab4604e6a793cf5b8300af8e4b5d2e9f3779c5c86a2e3306f159a" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.724606 4756 scope.go:117] "RemoveContainer" containerID="f2685483d1a07d055837cfb91685fd601760475b196eb8fc7987ea6e91403b42" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.769646 4756 scope.go:117] "RemoveContainer" containerID="e6730ec25cdadc0b0f00e8b0c212a52914b1f2c2fafa1eaee80a7d915faa080c" Mar 18 14:46:50 crc kubenswrapper[4756]: E0318 14:46:50.774525 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6730ec25cdadc0b0f00e8b0c212a52914b1f2c2fafa1eaee80a7d915faa080c\": container with ID starting with e6730ec25cdadc0b0f00e8b0c212a52914b1f2c2fafa1eaee80a7d915faa080c not found: ID does not exist" containerID="e6730ec25cdadc0b0f00e8b0c212a52914b1f2c2fafa1eaee80a7d915faa080c" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.774564 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6730ec25cdadc0b0f00e8b0c212a52914b1f2c2fafa1eaee80a7d915faa080c"} err="failed to get container status \"e6730ec25cdadc0b0f00e8b0c212a52914b1f2c2fafa1eaee80a7d915faa080c\": rpc error: code = NotFound desc = could not find container \"e6730ec25cdadc0b0f00e8b0c212a52914b1f2c2fafa1eaee80a7d915faa080c\": container with ID starting with e6730ec25cdadc0b0f00e8b0c212a52914b1f2c2fafa1eaee80a7d915faa080c not found: ID does not exist" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.774585 4756 scope.go:117] "RemoveContainer" containerID="bdc7db3f24bab4604e6a793cf5b8300af8e4b5d2e9f3779c5c86a2e3306f159a" Mar 18 14:46:50 crc kubenswrapper[4756]: E0318 14:46:50.774860 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"bdc7db3f24bab4604e6a793cf5b8300af8e4b5d2e9f3779c5c86a2e3306f159a\": container with ID starting with bdc7db3f24bab4604e6a793cf5b8300af8e4b5d2e9f3779c5c86a2e3306f159a not found: ID does not exist" containerID="bdc7db3f24bab4604e6a793cf5b8300af8e4b5d2e9f3779c5c86a2e3306f159a" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.774896 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdc7db3f24bab4604e6a793cf5b8300af8e4b5d2e9f3779c5c86a2e3306f159a"} err="failed to get container status \"bdc7db3f24bab4604e6a793cf5b8300af8e4b5d2e9f3779c5c86a2e3306f159a\": rpc error: code = NotFound desc = could not find container \"bdc7db3f24bab4604e6a793cf5b8300af8e4b5d2e9f3779c5c86a2e3306f159a\": container with ID starting with bdc7db3f24bab4604e6a793cf5b8300af8e4b5d2e9f3779c5c86a2e3306f159a not found: ID does not exist" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.774923 4756 scope.go:117] "RemoveContainer" containerID="f2685483d1a07d055837cfb91685fd601760475b196eb8fc7987ea6e91403b42" Mar 18 14:46:50 crc kubenswrapper[4756]: E0318 14:46:50.775165 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2685483d1a07d055837cfb91685fd601760475b196eb8fc7987ea6e91403b42\": container with ID starting with f2685483d1a07d055837cfb91685fd601760475b196eb8fc7987ea6e91403b42 not found: ID does not exist" containerID="f2685483d1a07d055837cfb91685fd601760475b196eb8fc7987ea6e91403b42" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.775186 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2685483d1a07d055837cfb91685fd601760475b196eb8fc7987ea6e91403b42"} err="failed to get container status \"f2685483d1a07d055837cfb91685fd601760475b196eb8fc7987ea6e91403b42\": rpc error: code = NotFound desc = could not find container 
\"f2685483d1a07d055837cfb91685fd601760475b196eb8fc7987ea6e91403b42\": container with ID starting with f2685483d1a07d055837cfb91685fd601760475b196eb8fc7987ea6e91403b42 not found: ID does not exist" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.839428 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25847598-c3f0-419a-b422-6b35f9f71311-utilities\") pod \"25847598-c3f0-419a-b422-6b35f9f71311\" (UID: \"25847598-c3f0-419a-b422-6b35f9f71311\") " Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.839515 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25847598-c3f0-419a-b422-6b35f9f71311-catalog-content\") pod \"25847598-c3f0-419a-b422-6b35f9f71311\" (UID: \"25847598-c3f0-419a-b422-6b35f9f71311\") " Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.839753 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nffpw\" (UniqueName: \"kubernetes.io/projected/25847598-c3f0-419a-b422-6b35f9f71311-kube-api-access-nffpw\") pod \"25847598-c3f0-419a-b422-6b35f9f71311\" (UID: \"25847598-c3f0-419a-b422-6b35f9f71311\") " Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.847003 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25847598-c3f0-419a-b422-6b35f9f71311-utilities" (OuterVolumeSpecName: "utilities") pod "25847598-c3f0-419a-b422-6b35f9f71311" (UID: "25847598-c3f0-419a-b422-6b35f9f71311"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.849451 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25847598-c3f0-419a-b422-6b35f9f71311-kube-api-access-nffpw" (OuterVolumeSpecName: "kube-api-access-nffpw") pod "25847598-c3f0-419a-b422-6b35f9f71311" (UID: "25847598-c3f0-419a-b422-6b35f9f71311"). InnerVolumeSpecName "kube-api-access-nffpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.928406 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25847598-c3f0-419a-b422-6b35f9f71311-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25847598-c3f0-419a-b422-6b35f9f71311" (UID: "25847598-c3f0-419a-b422-6b35f9f71311"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.942330 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nffpw\" (UniqueName: \"kubernetes.io/projected/25847598-c3f0-419a-b422-6b35f9f71311-kube-api-access-nffpw\") on node \"crc\" DevicePath \"\"" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.942357 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25847598-c3f0-419a-b422-6b35f9f71311-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:46:50 crc kubenswrapper[4756]: I0318 14:46:50.942366 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25847598-c3f0-419a-b422-6b35f9f71311-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:46:51 crc kubenswrapper[4756]: I0318 14:46:51.005362 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pb94g"] Mar 18 14:46:51 crc kubenswrapper[4756]: I0318 
14:46:51.013806 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pb94g"] Mar 18 14:46:51 crc kubenswrapper[4756]: I0318 14:46:51.325701 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25847598-c3f0-419a-b422-6b35f9f71311" path="/var/lib/kubelet/pods/25847598-c3f0-419a-b422-6b35f9f71311/volumes" Mar 18 14:46:56 crc kubenswrapper[4756]: E0318 14:46:56.516301 4756 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.34:55734->38.129.56.34:39757: write tcp 38.129.56.34:55734->38.129.56.34:39757: write: broken pipe Mar 18 14:47:50 crc kubenswrapper[4756]: I0318 14:47:50.931806 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 18 14:47:50 crc kubenswrapper[4756]: E0318 14:47:50.932987 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25847598-c3f0-419a-b422-6b35f9f71311" containerName="extract-utilities" Mar 18 14:47:50 crc kubenswrapper[4756]: I0318 14:47:50.933005 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="25847598-c3f0-419a-b422-6b35f9f71311" containerName="extract-utilities" Mar 18 14:47:50 crc kubenswrapper[4756]: E0318 14:47:50.933026 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" containerName="extract-content" Mar 18 14:47:50 crc kubenswrapper[4756]: I0318 14:47:50.933035 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" containerName="extract-content" Mar 18 14:47:50 crc kubenswrapper[4756]: E0318 14:47:50.933051 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" containerName="registry-server" Mar 18 14:47:50 crc kubenswrapper[4756]: I0318 14:47:50.933059 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" containerName="registry-server" Mar 18 14:47:50 crc 
kubenswrapper[4756]: E0318 14:47:50.933092 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" containerName="extract-utilities" Mar 18 14:47:50 crc kubenswrapper[4756]: I0318 14:47:50.933100 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" containerName="extract-utilities" Mar 18 14:47:50 crc kubenswrapper[4756]: E0318 14:47:50.933146 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25847598-c3f0-419a-b422-6b35f9f71311" containerName="extract-content" Mar 18 14:47:50 crc kubenswrapper[4756]: I0318 14:47:50.933158 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="25847598-c3f0-419a-b422-6b35f9f71311" containerName="extract-content" Mar 18 14:47:50 crc kubenswrapper[4756]: E0318 14:47:50.933183 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25847598-c3f0-419a-b422-6b35f9f71311" containerName="registry-server" Mar 18 14:47:50 crc kubenswrapper[4756]: I0318 14:47:50.933191 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="25847598-c3f0-419a-b422-6b35f9f71311" containerName="registry-server" Mar 18 14:47:50 crc kubenswrapper[4756]: I0318 14:47:50.933447 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="25847598-c3f0-419a-b422-6b35f9f71311" containerName="registry-server" Mar 18 14:47:50 crc kubenswrapper[4756]: I0318 14:47:50.933462 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bd25cfa-56aa-4d6a-92b8-f6d4358a3884" containerName="registry-server" Mar 18 14:47:50 crc kubenswrapper[4756]: I0318 14:47:50.934511 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 14:47:50 crc kubenswrapper[4756]: I0318 14:47:50.938942 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 18 14:47:50 crc kubenswrapper[4756]: I0318 14:47:50.939003 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-b452p" Mar 18 14:47:50 crc kubenswrapper[4756]: I0318 14:47:50.939828 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 18 14:47:50 crc kubenswrapper[4756]: I0318 14:47:50.939989 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 18 14:47:50 crc kubenswrapper[4756]: I0318 14:47:50.941514 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.060441 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5e78c973-235a-4dcb-927f-7a5d35e786cc-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.060785 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5e78c973-235a-4dcb-927f-7a5d35e786cc-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.060997 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.061086 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.061206 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.061385 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e78c973-235a-4dcb-927f-7a5d35e786cc-config-data\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.061510 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtttm\" (UniqueName: \"kubernetes.io/projected/5e78c973-235a-4dcb-927f-7a5d35e786cc-kube-api-access-jtttm\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.061669 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.061930 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e78c973-235a-4dcb-927f-7a5d35e786cc-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.164657 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.164938 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.165058 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.165226 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e78c973-235a-4dcb-927f-7a5d35e786cc-config-data\") pod \"tempest-tests-tempest\" (UID: 
\"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.165441 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtttm\" (UniqueName: \"kubernetes.io/projected/5e78c973-235a-4dcb-927f-7a5d35e786cc-kube-api-access-jtttm\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.165561 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.166971 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.167062 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e78c973-235a-4dcb-927f-7a5d35e786cc-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.167241 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5e78c973-235a-4dcb-927f-7a5d35e786cc-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") 
" pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.167347 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5e78c973-235a-4dcb-927f-7a5d35e786cc-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.168356 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5e78c973-235a-4dcb-927f-7a5d35e786cc-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.168849 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5e78c973-235a-4dcb-927f-7a5d35e786cc-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.169815 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e78c973-235a-4dcb-927f-7a5d35e786cc-config-data\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.170005 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e78c973-235a-4dcb-927f-7a5d35e786cc-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 
14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.172287 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.173497 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.173913 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.194188 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtttm\" (UniqueName: \"kubernetes.io/projected/5e78c973-235a-4dcb-927f-7a5d35e786cc-kube-api-access-jtttm\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.207556 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.268387 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.716467 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 18 14:47:51 crc kubenswrapper[4756]: I0318 14:47:51.727379 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:47:52 crc kubenswrapper[4756]: I0318 14:47:52.347078 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5e78c973-235a-4dcb-927f-7a5d35e786cc","Type":"ContainerStarted","Data":"1116aaa9af7774610bd4a59753895081e68c1369906172eed2323404ff9d6276"} Mar 18 14:48:00 crc kubenswrapper[4756]: I0318 14:48:00.134884 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564088-2dmgh"] Mar 18 14:48:00 crc kubenswrapper[4756]: I0318 14:48:00.136929 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564088-2dmgh" Mar 18 14:48:00 crc kubenswrapper[4756]: I0318 14:48:00.139393 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:48:00 crc kubenswrapper[4756]: I0318 14:48:00.140191 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:48:00 crc kubenswrapper[4756]: I0318 14:48:00.140382 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:48:00 crc kubenswrapper[4756]: I0318 14:48:00.145674 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564088-2dmgh"] Mar 18 14:48:00 crc kubenswrapper[4756]: I0318 14:48:00.287256 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9npv\" (UniqueName: 
\"kubernetes.io/projected/a5e80e2f-abcf-4daa-84dd-bc3d67b763d9-kube-api-access-l9npv\") pod \"auto-csr-approver-29564088-2dmgh\" (UID: \"a5e80e2f-abcf-4daa-84dd-bc3d67b763d9\") " pod="openshift-infra/auto-csr-approver-29564088-2dmgh" Mar 18 14:48:00 crc kubenswrapper[4756]: I0318 14:48:00.389423 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9npv\" (UniqueName: \"kubernetes.io/projected/a5e80e2f-abcf-4daa-84dd-bc3d67b763d9-kube-api-access-l9npv\") pod \"auto-csr-approver-29564088-2dmgh\" (UID: \"a5e80e2f-abcf-4daa-84dd-bc3d67b763d9\") " pod="openshift-infra/auto-csr-approver-29564088-2dmgh" Mar 18 14:48:00 crc kubenswrapper[4756]: I0318 14:48:00.418449 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9npv\" (UniqueName: \"kubernetes.io/projected/a5e80e2f-abcf-4daa-84dd-bc3d67b763d9-kube-api-access-l9npv\") pod \"auto-csr-approver-29564088-2dmgh\" (UID: \"a5e80e2f-abcf-4daa-84dd-bc3d67b763d9\") " pod="openshift-infra/auto-csr-approver-29564088-2dmgh" Mar 18 14:48:00 crc kubenswrapper[4756]: I0318 14:48:00.455698 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564088-2dmgh" Mar 18 14:48:02 crc kubenswrapper[4756]: I0318 14:48:02.307321 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564088-2dmgh"] Mar 18 14:48:20 crc kubenswrapper[4756]: E0318 14:48:20.518963 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 18 14:48:20 crc kubenswrapper[4756]: E0318 14:48:20.519678 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountP
ath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jtttm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(5e78c973-235a-4dcb-927f-7a5d35e786cc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 14:48:20 crc kubenswrapper[4756]: E0318 14:48:20.521214 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="5e78c973-235a-4dcb-927f-7a5d35e786cc" Mar 18 14:48:20 crc kubenswrapper[4756]: I0318 14:48:20.675583 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564088-2dmgh" event={"ID":"a5e80e2f-abcf-4daa-84dd-bc3d67b763d9","Type":"ContainerStarted","Data":"b7fadd59626f9fad7689594126254c68e38cb3bc48bcaa61985f1fb1c536f38e"} Mar 18 14:48:20 crc kubenswrapper[4756]: E0318 14:48:20.677300 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="5e78c973-235a-4dcb-927f-7a5d35e786cc" Mar 18 14:48:22 crc kubenswrapper[4756]: I0318 14:48:22.700318 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564088-2dmgh" event={"ID":"a5e80e2f-abcf-4daa-84dd-bc3d67b763d9","Type":"ContainerStarted","Data":"9446f84dc26e74a6c1c1b795ce02f223cbed2c5988315b5a12edd0d0481da9c3"} Mar 18 14:48:22 crc kubenswrapper[4756]: I0318 14:48:22.724359 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564088-2dmgh" podStartSLOduration=21.146226936 podStartE2EDuration="22.72433382s" podCreationTimestamp="2026-03-18 14:48:00 +0000 UTC" firstStartedPulling="2026-03-18 14:48:20.401078809 +0000 UTC m=+2901.715496804" lastFinishedPulling="2026-03-18 14:48:21.979185703 +0000 UTC m=+2903.293603688" observedRunningTime="2026-03-18 14:48:22.718938524 +0000 UTC m=+2904.033356539" watchObservedRunningTime="2026-03-18 14:48:22.72433382 +0000 UTC m=+2904.038751825" Mar 18 14:48:23 crc kubenswrapper[4756]: I0318 14:48:23.714213 4756 generic.go:334] 
"Generic (PLEG): container finished" podID="a5e80e2f-abcf-4daa-84dd-bc3d67b763d9" containerID="9446f84dc26e74a6c1c1b795ce02f223cbed2c5988315b5a12edd0d0481da9c3" exitCode=0 Mar 18 14:48:23 crc kubenswrapper[4756]: I0318 14:48:23.714289 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564088-2dmgh" event={"ID":"a5e80e2f-abcf-4daa-84dd-bc3d67b763d9","Type":"ContainerDied","Data":"9446f84dc26e74a6c1c1b795ce02f223cbed2c5988315b5a12edd0d0481da9c3"} Mar 18 14:48:25 crc kubenswrapper[4756]: I0318 14:48:25.302844 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564088-2dmgh" Mar 18 14:48:25 crc kubenswrapper[4756]: I0318 14:48:25.413752 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9npv\" (UniqueName: \"kubernetes.io/projected/a5e80e2f-abcf-4daa-84dd-bc3d67b763d9-kube-api-access-l9npv\") pod \"a5e80e2f-abcf-4daa-84dd-bc3d67b763d9\" (UID: \"a5e80e2f-abcf-4daa-84dd-bc3d67b763d9\") " Mar 18 14:48:25 crc kubenswrapper[4756]: I0318 14:48:25.434023 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e80e2f-abcf-4daa-84dd-bc3d67b763d9-kube-api-access-l9npv" (OuterVolumeSpecName: "kube-api-access-l9npv") pod "a5e80e2f-abcf-4daa-84dd-bc3d67b763d9" (UID: "a5e80e2f-abcf-4daa-84dd-bc3d67b763d9"). InnerVolumeSpecName "kube-api-access-l9npv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:48:25 crc kubenswrapper[4756]: I0318 14:48:25.517146 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9npv\" (UniqueName: \"kubernetes.io/projected/a5e80e2f-abcf-4daa-84dd-bc3d67b763d9-kube-api-access-l9npv\") on node \"crc\" DevicePath \"\"" Mar 18 14:48:25 crc kubenswrapper[4756]: I0318 14:48:25.743816 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564088-2dmgh" event={"ID":"a5e80e2f-abcf-4daa-84dd-bc3d67b763d9","Type":"ContainerDied","Data":"b7fadd59626f9fad7689594126254c68e38cb3bc48bcaa61985f1fb1c536f38e"} Mar 18 14:48:25 crc kubenswrapper[4756]: I0318 14:48:25.743891 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7fadd59626f9fad7689594126254c68e38cb3bc48bcaa61985f1fb1c536f38e" Mar 18 14:48:25 crc kubenswrapper[4756]: I0318 14:48:25.743980 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564088-2dmgh" Mar 18 14:48:25 crc kubenswrapper[4756]: I0318 14:48:25.818970 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564082-njvsw"] Mar 18 14:48:25 crc kubenswrapper[4756]: I0318 14:48:25.831285 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564082-njvsw"] Mar 18 14:48:27 crc kubenswrapper[4756]: I0318 14:48:27.326895 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="140be45f-1eab-4833-b001-7e7aec24337a" path="/var/lib/kubelet/pods/140be45f-1eab-4833-b001-7e7aec24337a/volumes" Mar 18 14:48:33 crc kubenswrapper[4756]: I0318 14:48:33.902795 4756 scope.go:117] "RemoveContainer" containerID="585ca17760858d576feb1fc837f5f8baf35f4e9d3ff6fe90858ac7322134412a" Mar 18 14:48:34 crc kubenswrapper[4756]: I0318 14:48:34.613472 4756 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 18 14:48:36 crc kubenswrapper[4756]: I0318 14:48:36.882413 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5e78c973-235a-4dcb-927f-7a5d35e786cc","Type":"ContainerStarted","Data":"f7940a5bac3695fdcd7611031f9cc8cfe7f18b40ca9ead9e69a01907b9dffd5e"} Mar 18 14:48:41 crc kubenswrapper[4756]: I0318 14:48:41.772440 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=9.889960853 podStartE2EDuration="52.772417209s" podCreationTimestamp="2026-03-18 14:47:49 +0000 UTC" firstStartedPulling="2026-03-18 14:47:51.72718764 +0000 UTC m=+2873.041605615" lastFinishedPulling="2026-03-18 14:48:34.609643956 +0000 UTC m=+2915.924061971" observedRunningTime="2026-03-18 14:48:36.90897753 +0000 UTC m=+2918.223395525" watchObservedRunningTime="2026-03-18 14:48:41.772417209 +0000 UTC m=+2923.086835204" Mar 18 14:48:41 crc kubenswrapper[4756]: I0318 14:48:41.784332 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4895s"] Mar 18 14:48:41 crc kubenswrapper[4756]: E0318 14:48:41.784981 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e80e2f-abcf-4daa-84dd-bc3d67b763d9" containerName="oc" Mar 18 14:48:41 crc kubenswrapper[4756]: I0318 14:48:41.784999 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e80e2f-abcf-4daa-84dd-bc3d67b763d9" containerName="oc" Mar 18 14:48:41 crc kubenswrapper[4756]: I0318 14:48:41.785296 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e80e2f-abcf-4daa-84dd-bc3d67b763d9" containerName="oc" Mar 18 14:48:41 crc kubenswrapper[4756]: I0318 14:48:41.786765 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:41 crc kubenswrapper[4756]: I0318 14:48:41.795702 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4895s"] Mar 18 14:48:41 crc kubenswrapper[4756]: I0318 14:48:41.836465 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f92eb80-f121-4d5f-9903-870957880e47-utilities\") pod \"redhat-marketplace-4895s\" (UID: \"1f92eb80-f121-4d5f-9903-870957880e47\") " pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:41 crc kubenswrapper[4756]: I0318 14:48:41.836574 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjf56\" (UniqueName: \"kubernetes.io/projected/1f92eb80-f121-4d5f-9903-870957880e47-kube-api-access-jjf56\") pod \"redhat-marketplace-4895s\" (UID: \"1f92eb80-f121-4d5f-9903-870957880e47\") " pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:41 crc kubenswrapper[4756]: I0318 14:48:41.836620 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f92eb80-f121-4d5f-9903-870957880e47-catalog-content\") pod \"redhat-marketplace-4895s\" (UID: \"1f92eb80-f121-4d5f-9903-870957880e47\") " pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:41 crc kubenswrapper[4756]: I0318 14:48:41.964809 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f92eb80-f121-4d5f-9903-870957880e47-utilities\") pod \"redhat-marketplace-4895s\" (UID: \"1f92eb80-f121-4d5f-9903-870957880e47\") " pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:41 crc kubenswrapper[4756]: I0318 14:48:41.965911 4756 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-jjf56\" (UniqueName: \"kubernetes.io/projected/1f92eb80-f121-4d5f-9903-870957880e47-kube-api-access-jjf56\") pod \"redhat-marketplace-4895s\" (UID: \"1f92eb80-f121-4d5f-9903-870957880e47\") " pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:41 crc kubenswrapper[4756]: I0318 14:48:41.965979 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f92eb80-f121-4d5f-9903-870957880e47-utilities\") pod \"redhat-marketplace-4895s\" (UID: \"1f92eb80-f121-4d5f-9903-870957880e47\") " pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:41 crc kubenswrapper[4756]: I0318 14:48:41.966095 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f92eb80-f121-4d5f-9903-870957880e47-catalog-content\") pod \"redhat-marketplace-4895s\" (UID: \"1f92eb80-f121-4d5f-9903-870957880e47\") " pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:41 crc kubenswrapper[4756]: I0318 14:48:41.966487 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f92eb80-f121-4d5f-9903-870957880e47-catalog-content\") pod \"redhat-marketplace-4895s\" (UID: \"1f92eb80-f121-4d5f-9903-870957880e47\") " pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:41 crc kubenswrapper[4756]: I0318 14:48:41.987347 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjf56\" (UniqueName: \"kubernetes.io/projected/1f92eb80-f121-4d5f-9903-870957880e47-kube-api-access-jjf56\") pod \"redhat-marketplace-4895s\" (UID: \"1f92eb80-f121-4d5f-9903-870957880e47\") " pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:42 crc kubenswrapper[4756]: I0318 14:48:42.107979 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:42 crc kubenswrapper[4756]: I0318 14:48:42.734434 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4895s"] Mar 18 14:48:42 crc kubenswrapper[4756]: W0318 14:48:42.744539 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f92eb80_f121_4d5f_9903_870957880e47.slice/crio-077d1fcac5d1c70666637fabbe26a2195f7ef50df43a5d67ac784167c6371d60 WatchSource:0}: Error finding container 077d1fcac5d1c70666637fabbe26a2195f7ef50df43a5d67ac784167c6371d60: Status 404 returned error can't find the container with id 077d1fcac5d1c70666637fabbe26a2195f7ef50df43a5d67ac784167c6371d60 Mar 18 14:48:42 crc kubenswrapper[4756]: I0318 14:48:42.969725 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4895s" event={"ID":"1f92eb80-f121-4d5f-9903-870957880e47","Type":"ContainerStarted","Data":"077d1fcac5d1c70666637fabbe26a2195f7ef50df43a5d67ac784167c6371d60"} Mar 18 14:48:43 crc kubenswrapper[4756]: I0318 14:48:43.986263 4756 generic.go:334] "Generic (PLEG): container finished" podID="1f92eb80-f121-4d5f-9903-870957880e47" containerID="aa953c1338f4ad2f2bdb2f2ddf2d9933c05de93ae62a145f972dbdbf97bd92d3" exitCode=0 Mar 18 14:48:43 crc kubenswrapper[4756]: I0318 14:48:43.986499 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4895s" event={"ID":"1f92eb80-f121-4d5f-9903-870957880e47","Type":"ContainerDied","Data":"aa953c1338f4ad2f2bdb2f2ddf2d9933c05de93ae62a145f972dbdbf97bd92d3"} Mar 18 14:48:47 crc kubenswrapper[4756]: I0318 14:48:47.035593 4756 generic.go:334] "Generic (PLEG): container finished" podID="1f92eb80-f121-4d5f-9903-870957880e47" containerID="deec0e6c81efaad9c0c9f6689555a96cc3a921aeea0a92ad006352b828e00266" exitCode=0 Mar 18 14:48:47 crc kubenswrapper[4756]: I0318 
14:48:47.035843 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4895s" event={"ID":"1f92eb80-f121-4d5f-9903-870957880e47","Type":"ContainerDied","Data":"deec0e6c81efaad9c0c9f6689555a96cc3a921aeea0a92ad006352b828e00266"} Mar 18 14:48:49 crc kubenswrapper[4756]: I0318 14:48:49.080415 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4895s" event={"ID":"1f92eb80-f121-4d5f-9903-870957880e47","Type":"ContainerStarted","Data":"7bb52212cb3ec839af64359b24aed9cb2eaa4f6101282718738c830d8e1cf41d"} Mar 18 14:48:49 crc kubenswrapper[4756]: I0318 14:48:49.114010 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4895s" podStartSLOduration=4.164516028 podStartE2EDuration="8.113989762s" podCreationTimestamp="2026-03-18 14:48:41 +0000 UTC" firstStartedPulling="2026-03-18 14:48:43.98979408 +0000 UTC m=+2925.304212095" lastFinishedPulling="2026-03-18 14:48:47.939267814 +0000 UTC m=+2929.253685829" observedRunningTime="2026-03-18 14:48:49.102009119 +0000 UTC m=+2930.416427104" watchObservedRunningTime="2026-03-18 14:48:49.113989762 +0000 UTC m=+2930.428407737" Mar 18 14:48:52 crc kubenswrapper[4756]: I0318 14:48:52.108820 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:52 crc kubenswrapper[4756]: I0318 14:48:52.109714 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:52 crc kubenswrapper[4756]: I0318 14:48:52.174322 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:53 crc kubenswrapper[4756]: I0318 14:48:53.200915 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 
14:48:53 crc kubenswrapper[4756]: I0318 14:48:53.264692 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4895s"] Mar 18 14:48:55 crc kubenswrapper[4756]: I0318 14:48:55.144955 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4895s" podUID="1f92eb80-f121-4d5f-9903-870957880e47" containerName="registry-server" containerID="cri-o://7bb52212cb3ec839af64359b24aed9cb2eaa4f6101282718738c830d8e1cf41d" gracePeriod=2 Mar 18 14:48:55 crc kubenswrapper[4756]: I0318 14:48:55.720812 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:55 crc kubenswrapper[4756]: I0318 14:48:55.915160 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f92eb80-f121-4d5f-9903-870957880e47-catalog-content\") pod \"1f92eb80-f121-4d5f-9903-870957880e47\" (UID: \"1f92eb80-f121-4d5f-9903-870957880e47\") " Mar 18 14:48:55 crc kubenswrapper[4756]: I0318 14:48:55.915434 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjf56\" (UniqueName: \"kubernetes.io/projected/1f92eb80-f121-4d5f-9903-870957880e47-kube-api-access-jjf56\") pod \"1f92eb80-f121-4d5f-9903-870957880e47\" (UID: \"1f92eb80-f121-4d5f-9903-870957880e47\") " Mar 18 14:48:55 crc kubenswrapper[4756]: I0318 14:48:55.915599 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f92eb80-f121-4d5f-9903-870957880e47-utilities\") pod \"1f92eb80-f121-4d5f-9903-870957880e47\" (UID: \"1f92eb80-f121-4d5f-9903-870957880e47\") " Mar 18 14:48:55 crc kubenswrapper[4756]: I0318 14:48:55.916315 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1f92eb80-f121-4d5f-9903-870957880e47-utilities" (OuterVolumeSpecName: "utilities") pod "1f92eb80-f121-4d5f-9903-870957880e47" (UID: "1f92eb80-f121-4d5f-9903-870957880e47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:48:55 crc kubenswrapper[4756]: I0318 14:48:55.930335 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f92eb80-f121-4d5f-9903-870957880e47-kube-api-access-jjf56" (OuterVolumeSpecName: "kube-api-access-jjf56") pod "1f92eb80-f121-4d5f-9903-870957880e47" (UID: "1f92eb80-f121-4d5f-9903-870957880e47"). InnerVolumeSpecName "kube-api-access-jjf56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:48:55 crc kubenswrapper[4756]: I0318 14:48:55.942218 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f92eb80-f121-4d5f-9903-870957880e47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f92eb80-f121-4d5f-9903-870957880e47" (UID: "1f92eb80-f121-4d5f-9903-870957880e47"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.018060 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f92eb80-f121-4d5f-9903-870957880e47-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.018094 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjf56\" (UniqueName: \"kubernetes.io/projected/1f92eb80-f121-4d5f-9903-870957880e47-kube-api-access-jjf56\") on node \"crc\" DevicePath \"\"" Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.018106 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f92eb80-f121-4d5f-9903-870957880e47-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.155844 4756 generic.go:334] "Generic (PLEG): container finished" podID="1f92eb80-f121-4d5f-9903-870957880e47" containerID="7bb52212cb3ec839af64359b24aed9cb2eaa4f6101282718738c830d8e1cf41d" exitCode=0 Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.155857 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4895s" event={"ID":"1f92eb80-f121-4d5f-9903-870957880e47","Type":"ContainerDied","Data":"7bb52212cb3ec839af64359b24aed9cb2eaa4f6101282718738c830d8e1cf41d"} Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.155880 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4895s" Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.155964 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4895s" event={"ID":"1f92eb80-f121-4d5f-9903-870957880e47","Type":"ContainerDied","Data":"077d1fcac5d1c70666637fabbe26a2195f7ef50df43a5d67ac784167c6371d60"} Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.156018 4756 scope.go:117] "RemoveContainer" containerID="7bb52212cb3ec839af64359b24aed9cb2eaa4f6101282718738c830d8e1cf41d" Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.176560 4756 scope.go:117] "RemoveContainer" containerID="deec0e6c81efaad9c0c9f6689555a96cc3a921aeea0a92ad006352b828e00266" Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.190636 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4895s"] Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.210252 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4895s"] Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.222276 4756 scope.go:117] "RemoveContainer" containerID="aa953c1338f4ad2f2bdb2f2ddf2d9933c05de93ae62a145f972dbdbf97bd92d3" Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.266800 4756 scope.go:117] "RemoveContainer" containerID="7bb52212cb3ec839af64359b24aed9cb2eaa4f6101282718738c830d8e1cf41d" Mar 18 14:48:56 crc kubenswrapper[4756]: E0318 14:48:56.267219 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bb52212cb3ec839af64359b24aed9cb2eaa4f6101282718738c830d8e1cf41d\": container with ID starting with 7bb52212cb3ec839af64359b24aed9cb2eaa4f6101282718738c830d8e1cf41d not found: ID does not exist" containerID="7bb52212cb3ec839af64359b24aed9cb2eaa4f6101282718738c830d8e1cf41d" Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.267251 4756 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb52212cb3ec839af64359b24aed9cb2eaa4f6101282718738c830d8e1cf41d"} err="failed to get container status \"7bb52212cb3ec839af64359b24aed9cb2eaa4f6101282718738c830d8e1cf41d\": rpc error: code = NotFound desc = could not find container \"7bb52212cb3ec839af64359b24aed9cb2eaa4f6101282718738c830d8e1cf41d\": container with ID starting with 7bb52212cb3ec839af64359b24aed9cb2eaa4f6101282718738c830d8e1cf41d not found: ID does not exist" Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.267271 4756 scope.go:117] "RemoveContainer" containerID="deec0e6c81efaad9c0c9f6689555a96cc3a921aeea0a92ad006352b828e00266" Mar 18 14:48:56 crc kubenswrapper[4756]: E0318 14:48:56.267455 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deec0e6c81efaad9c0c9f6689555a96cc3a921aeea0a92ad006352b828e00266\": container with ID starting with deec0e6c81efaad9c0c9f6689555a96cc3a921aeea0a92ad006352b828e00266 not found: ID does not exist" containerID="deec0e6c81efaad9c0c9f6689555a96cc3a921aeea0a92ad006352b828e00266" Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.267476 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deec0e6c81efaad9c0c9f6689555a96cc3a921aeea0a92ad006352b828e00266"} err="failed to get container status \"deec0e6c81efaad9c0c9f6689555a96cc3a921aeea0a92ad006352b828e00266\": rpc error: code = NotFound desc = could not find container \"deec0e6c81efaad9c0c9f6689555a96cc3a921aeea0a92ad006352b828e00266\": container with ID starting with deec0e6c81efaad9c0c9f6689555a96cc3a921aeea0a92ad006352b828e00266 not found: ID does not exist" Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.267487 4756 scope.go:117] "RemoveContainer" containerID="aa953c1338f4ad2f2bdb2f2ddf2d9933c05de93ae62a145f972dbdbf97bd92d3" Mar 18 14:48:56 crc kubenswrapper[4756]: E0318 
14:48:56.270397 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa953c1338f4ad2f2bdb2f2ddf2d9933c05de93ae62a145f972dbdbf97bd92d3\": container with ID starting with aa953c1338f4ad2f2bdb2f2ddf2d9933c05de93ae62a145f972dbdbf97bd92d3 not found: ID does not exist" containerID="aa953c1338f4ad2f2bdb2f2ddf2d9933c05de93ae62a145f972dbdbf97bd92d3" Mar 18 14:48:56 crc kubenswrapper[4756]: I0318 14:48:56.270452 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa953c1338f4ad2f2bdb2f2ddf2d9933c05de93ae62a145f972dbdbf97bd92d3"} err="failed to get container status \"aa953c1338f4ad2f2bdb2f2ddf2d9933c05de93ae62a145f972dbdbf97bd92d3\": rpc error: code = NotFound desc = could not find container \"aa953c1338f4ad2f2bdb2f2ddf2d9933c05de93ae62a145f972dbdbf97bd92d3\": container with ID starting with aa953c1338f4ad2f2bdb2f2ddf2d9933c05de93ae62a145f972dbdbf97bd92d3 not found: ID does not exist" Mar 18 14:48:57 crc kubenswrapper[4756]: I0318 14:48:57.333388 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f92eb80-f121-4d5f-9903-870957880e47" path="/var/lib/kubelet/pods/1f92eb80-f121-4d5f-9903-870957880e47/volumes" Mar 18 14:49:06 crc kubenswrapper[4756]: I0318 14:49:06.915361 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:49:06 crc kubenswrapper[4756]: I0318 14:49:06.915982 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 18 14:49:36 crc kubenswrapper[4756]: I0318 14:49:36.915292 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:49:36 crc kubenswrapper[4756]: I0318 14:49:36.915785 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:50:00 crc kubenswrapper[4756]: I0318 14:50:00.166169 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564090-697tj"] Mar 18 14:50:00 crc kubenswrapper[4756]: E0318 14:50:00.167670 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f92eb80-f121-4d5f-9903-870957880e47" containerName="extract-utilities" Mar 18 14:50:00 crc kubenswrapper[4756]: I0318 14:50:00.167694 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f92eb80-f121-4d5f-9903-870957880e47" containerName="extract-utilities" Mar 18 14:50:00 crc kubenswrapper[4756]: E0318 14:50:00.167729 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f92eb80-f121-4d5f-9903-870957880e47" containerName="registry-server" Mar 18 14:50:00 crc kubenswrapper[4756]: I0318 14:50:00.167742 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f92eb80-f121-4d5f-9903-870957880e47" containerName="registry-server" Mar 18 14:50:00 crc kubenswrapper[4756]: E0318 14:50:00.167811 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f92eb80-f121-4d5f-9903-870957880e47" containerName="extract-content" Mar 18 14:50:00 crc kubenswrapper[4756]: I0318 14:50:00.167826 4756 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1f92eb80-f121-4d5f-9903-870957880e47" containerName="extract-content" Mar 18 14:50:00 crc kubenswrapper[4756]: I0318 14:50:00.168216 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f92eb80-f121-4d5f-9903-870957880e47" containerName="registry-server" Mar 18 14:50:00 crc kubenswrapper[4756]: I0318 14:50:00.169843 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564090-697tj" Mar 18 14:50:00 crc kubenswrapper[4756]: I0318 14:50:00.173892 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:50:00 crc kubenswrapper[4756]: I0318 14:50:00.174196 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:50:00 crc kubenswrapper[4756]: I0318 14:50:00.174298 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:50:00 crc kubenswrapper[4756]: I0318 14:50:00.182776 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564090-697tj"] Mar 18 14:50:00 crc kubenswrapper[4756]: I0318 14:50:00.191628 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx9gd\" (UniqueName: \"kubernetes.io/projected/eead16c4-568f-4945-ac43-52a7138e6db8-kube-api-access-rx9gd\") pod \"auto-csr-approver-29564090-697tj\" (UID: \"eead16c4-568f-4945-ac43-52a7138e6db8\") " pod="openshift-infra/auto-csr-approver-29564090-697tj" Mar 18 14:50:00 crc kubenswrapper[4756]: I0318 14:50:00.293429 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx9gd\" (UniqueName: \"kubernetes.io/projected/eead16c4-568f-4945-ac43-52a7138e6db8-kube-api-access-rx9gd\") pod \"auto-csr-approver-29564090-697tj\" (UID: 
\"eead16c4-568f-4945-ac43-52a7138e6db8\") " pod="openshift-infra/auto-csr-approver-29564090-697tj" Mar 18 14:50:00 crc kubenswrapper[4756]: I0318 14:50:00.314753 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx9gd\" (UniqueName: \"kubernetes.io/projected/eead16c4-568f-4945-ac43-52a7138e6db8-kube-api-access-rx9gd\") pod \"auto-csr-approver-29564090-697tj\" (UID: \"eead16c4-568f-4945-ac43-52a7138e6db8\") " pod="openshift-infra/auto-csr-approver-29564090-697tj" Mar 18 14:50:00 crc kubenswrapper[4756]: I0318 14:50:00.532534 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564090-697tj" Mar 18 14:50:01 crc kubenswrapper[4756]: I0318 14:50:01.052570 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564090-697tj"] Mar 18 14:50:01 crc kubenswrapper[4756]: I0318 14:50:01.876207 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564090-697tj" event={"ID":"eead16c4-568f-4945-ac43-52a7138e6db8","Type":"ContainerStarted","Data":"9a1c6ff46bbabe951059621cf303f44ef536e7dbdcefc8d248556044a369e33c"} Mar 18 14:50:03 crc kubenswrapper[4756]: I0318 14:50:03.898870 4756 generic.go:334] "Generic (PLEG): container finished" podID="eead16c4-568f-4945-ac43-52a7138e6db8" containerID="9c7bc4985276144d5c63fcfe46a497b63a10463e6ebc79f6cd96f794b1f7d192" exitCode=0 Mar 18 14:50:03 crc kubenswrapper[4756]: I0318 14:50:03.899064 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564090-697tj" event={"ID":"eead16c4-568f-4945-ac43-52a7138e6db8","Type":"ContainerDied","Data":"9c7bc4985276144d5c63fcfe46a497b63a10463e6ebc79f6cd96f794b1f7d192"} Mar 18 14:50:05 crc kubenswrapper[4756]: I0318 14:50:05.568603 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564090-697tj" Mar 18 14:50:05 crc kubenswrapper[4756]: I0318 14:50:05.716088 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx9gd\" (UniqueName: \"kubernetes.io/projected/eead16c4-568f-4945-ac43-52a7138e6db8-kube-api-access-rx9gd\") pod \"eead16c4-568f-4945-ac43-52a7138e6db8\" (UID: \"eead16c4-568f-4945-ac43-52a7138e6db8\") " Mar 18 14:50:05 crc kubenswrapper[4756]: I0318 14:50:05.728514 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eead16c4-568f-4945-ac43-52a7138e6db8-kube-api-access-rx9gd" (OuterVolumeSpecName: "kube-api-access-rx9gd") pod "eead16c4-568f-4945-ac43-52a7138e6db8" (UID: "eead16c4-568f-4945-ac43-52a7138e6db8"). InnerVolumeSpecName "kube-api-access-rx9gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:50:05 crc kubenswrapper[4756]: I0318 14:50:05.818719 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx9gd\" (UniqueName: \"kubernetes.io/projected/eead16c4-568f-4945-ac43-52a7138e6db8-kube-api-access-rx9gd\") on node \"crc\" DevicePath \"\"" Mar 18 14:50:05 crc kubenswrapper[4756]: I0318 14:50:05.921886 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564090-697tj" event={"ID":"eead16c4-568f-4945-ac43-52a7138e6db8","Type":"ContainerDied","Data":"9a1c6ff46bbabe951059621cf303f44ef536e7dbdcefc8d248556044a369e33c"} Mar 18 14:50:05 crc kubenswrapper[4756]: I0318 14:50:05.921930 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a1c6ff46bbabe951059621cf303f44ef536e7dbdcefc8d248556044a369e33c" Mar 18 14:50:05 crc kubenswrapper[4756]: I0318 14:50:05.922014 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564090-697tj" Mar 18 14:50:06 crc kubenswrapper[4756]: I0318 14:50:06.674162 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564084-8vqbb"] Mar 18 14:50:06 crc kubenswrapper[4756]: I0318 14:50:06.689636 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564084-8vqbb"] Mar 18 14:50:06 crc kubenswrapper[4756]: I0318 14:50:06.914975 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:50:06 crc kubenswrapper[4756]: I0318 14:50:06.915071 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:50:06 crc kubenswrapper[4756]: I0318 14:50:06.915232 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:50:06 crc kubenswrapper[4756]: I0318 14:50:06.916412 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c541bb14ab625c993df67e9d02820948c3b385ef7f0e95f5dad40eb319141136"} pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:50:06 crc kubenswrapper[4756]: I0318 14:50:06.916555 4756 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" containerID="cri-o://c541bb14ab625c993df67e9d02820948c3b385ef7f0e95f5dad40eb319141136" gracePeriod=600 Mar 18 14:50:07 crc kubenswrapper[4756]: I0318 14:50:07.331927 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3690f204-953e-4a6d-8aa3-4f7f0fa5ff63" path="/var/lib/kubelet/pods/3690f204-953e-4a6d-8aa3-4f7f0fa5ff63/volumes" Mar 18 14:50:07 crc kubenswrapper[4756]: I0318 14:50:07.949299 4756 generic.go:334] "Generic (PLEG): container finished" podID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerID="c541bb14ab625c993df67e9d02820948c3b385ef7f0e95f5dad40eb319141136" exitCode=0 Mar 18 14:50:07 crc kubenswrapper[4756]: I0318 14:50:07.949381 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerDied","Data":"c541bb14ab625c993df67e9d02820948c3b385ef7f0e95f5dad40eb319141136"} Mar 18 14:50:07 crc kubenswrapper[4756]: I0318 14:50:07.949452 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b"} Mar 18 14:50:07 crc kubenswrapper[4756]: I0318 14:50:07.949477 4756 scope.go:117] "RemoveContainer" containerID="e92a2b23e82d42f80651eb0a3de70b9c5c021a53c1d937bb78b09085183e425b" Mar 18 14:50:34 crc kubenswrapper[4756]: I0318 14:50:34.069523 4756 scope.go:117] "RemoveContainer" containerID="a6130cb9309217eba359a4e6de2fcfef5ce0201c410f97e19ca55f4e0b33f8e4" Mar 18 14:50:38 crc kubenswrapper[4756]: I0318 14:50:38.400627 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-whmx9"] Mar 18 14:50:38 crc 
kubenswrapper[4756]: E0318 14:50:38.401723 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eead16c4-568f-4945-ac43-52a7138e6db8" containerName="oc" Mar 18 14:50:38 crc kubenswrapper[4756]: I0318 14:50:38.401741 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="eead16c4-568f-4945-ac43-52a7138e6db8" containerName="oc" Mar 18 14:50:38 crc kubenswrapper[4756]: I0318 14:50:38.402036 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="eead16c4-568f-4945-ac43-52a7138e6db8" containerName="oc" Mar 18 14:50:38 crc kubenswrapper[4756]: I0318 14:50:38.403843 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:38 crc kubenswrapper[4756]: I0318 14:50:38.425598 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whmx9"] Mar 18 14:50:38 crc kubenswrapper[4756]: I0318 14:50:38.465705 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdcj8\" (UniqueName: \"kubernetes.io/projected/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-kube-api-access-xdcj8\") pod \"certified-operators-whmx9\" (UID: \"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d\") " pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:38 crc kubenswrapper[4756]: I0318 14:50:38.465934 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-utilities\") pod \"certified-operators-whmx9\" (UID: \"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d\") " pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:38 crc kubenswrapper[4756]: I0318 14:50:38.465966 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-catalog-content\") pod \"certified-operators-whmx9\" (UID: \"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d\") " pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:38 crc kubenswrapper[4756]: I0318 14:50:38.568498 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdcj8\" (UniqueName: \"kubernetes.io/projected/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-kube-api-access-xdcj8\") pod \"certified-operators-whmx9\" (UID: \"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d\") " pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:38 crc kubenswrapper[4756]: I0318 14:50:38.568596 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-utilities\") pod \"certified-operators-whmx9\" (UID: \"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d\") " pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:38 crc kubenswrapper[4756]: I0318 14:50:38.568617 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-catalog-content\") pod \"certified-operators-whmx9\" (UID: \"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d\") " pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:38 crc kubenswrapper[4756]: I0318 14:50:38.569629 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-utilities\") pod \"certified-operators-whmx9\" (UID: \"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d\") " pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:38 crc kubenswrapper[4756]: I0318 14:50:38.569661 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-catalog-content\") pod \"certified-operators-whmx9\" (UID: \"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d\") " pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:38 crc kubenswrapper[4756]: I0318 14:50:38.594297 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdcj8\" (UniqueName: \"kubernetes.io/projected/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-kube-api-access-xdcj8\") pod \"certified-operators-whmx9\" (UID: \"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d\") " pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:38 crc kubenswrapper[4756]: I0318 14:50:38.740343 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:39 crc kubenswrapper[4756]: I0318 14:50:39.312421 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whmx9"] Mar 18 14:50:40 crc kubenswrapper[4756]: I0318 14:50:40.301750 4756 generic.go:334] "Generic (PLEG): container finished" podID="4d2a5abb-4f34-4c4b-9d20-c0d7f275214d" containerID="643df94366409f508c501b1f3a4af81709c063ced3e1df4e6b1e6a84c6e33dfe" exitCode=0 Mar 18 14:50:40 crc kubenswrapper[4756]: I0318 14:50:40.301987 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whmx9" event={"ID":"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d","Type":"ContainerDied","Data":"643df94366409f508c501b1f3a4af81709c063ced3e1df4e6b1e6a84c6e33dfe"} Mar 18 14:50:40 crc kubenswrapper[4756]: I0318 14:50:40.302251 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whmx9" event={"ID":"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d","Type":"ContainerStarted","Data":"581b28e1eb0425022151f3380e54a0ed8afda870553c3c1b3df1d2d4cac5e8da"} Mar 18 14:50:42 crc kubenswrapper[4756]: I0318 14:50:42.319589 4756 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-whmx9" event={"ID":"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d","Type":"ContainerStarted","Data":"90d032c58dfe481b577d0cf654f7dd86d1debd3bbdf5056608379ca4d3b9dfb7"} Mar 18 14:50:43 crc kubenswrapper[4756]: I0318 14:50:43.330065 4756 generic.go:334] "Generic (PLEG): container finished" podID="4d2a5abb-4f34-4c4b-9d20-c0d7f275214d" containerID="90d032c58dfe481b577d0cf654f7dd86d1debd3bbdf5056608379ca4d3b9dfb7" exitCode=0 Mar 18 14:50:43 crc kubenswrapper[4756]: I0318 14:50:43.330152 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whmx9" event={"ID":"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d","Type":"ContainerDied","Data":"90d032c58dfe481b577d0cf654f7dd86d1debd3bbdf5056608379ca4d3b9dfb7"} Mar 18 14:50:44 crc kubenswrapper[4756]: I0318 14:50:44.378225 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whmx9" event={"ID":"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d","Type":"ContainerStarted","Data":"1160dca893e205a7e1ae07baf6179a8d829e8f9981191175936b2b14882f3492"} Mar 18 14:50:44 crc kubenswrapper[4756]: I0318 14:50:44.408722 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-whmx9" podStartSLOduration=3.003501668 podStartE2EDuration="6.408690981s" podCreationTimestamp="2026-03-18 14:50:38 +0000 UTC" firstStartedPulling="2026-03-18 14:50:40.304327874 +0000 UTC m=+3041.618745849" lastFinishedPulling="2026-03-18 14:50:43.709517187 +0000 UTC m=+3045.023935162" observedRunningTime="2026-03-18 14:50:44.395957278 +0000 UTC m=+3045.710375263" watchObservedRunningTime="2026-03-18 14:50:44.408690981 +0000 UTC m=+3045.723108956" Mar 18 14:50:48 crc kubenswrapper[4756]: I0318 14:50:48.741190 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:48 crc kubenswrapper[4756]: I0318 
14:50:48.741831 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:48 crc kubenswrapper[4756]: I0318 14:50:48.791862 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:49 crc kubenswrapper[4756]: I0318 14:50:49.489893 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:51 crc kubenswrapper[4756]: I0318 14:50:51.792109 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-whmx9"] Mar 18 14:50:51 crc kubenswrapper[4756]: I0318 14:50:51.793010 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-whmx9" podUID="4d2a5abb-4f34-4c4b-9d20-c0d7f275214d" containerName="registry-server" containerID="cri-o://1160dca893e205a7e1ae07baf6179a8d829e8f9981191175936b2b14882f3492" gracePeriod=2 Mar 18 14:50:52 crc kubenswrapper[4756]: I0318 14:50:52.468401 4756 generic.go:334] "Generic (PLEG): container finished" podID="4d2a5abb-4f34-4c4b-9d20-c0d7f275214d" containerID="1160dca893e205a7e1ae07baf6179a8d829e8f9981191175936b2b14882f3492" exitCode=0 Mar 18 14:50:52 crc kubenswrapper[4756]: I0318 14:50:52.468727 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whmx9" event={"ID":"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d","Type":"ContainerDied","Data":"1160dca893e205a7e1ae07baf6179a8d829e8f9981191175936b2b14882f3492"} Mar 18 14:50:52 crc kubenswrapper[4756]: I0318 14:50:52.468754 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whmx9" event={"ID":"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d","Type":"ContainerDied","Data":"581b28e1eb0425022151f3380e54a0ed8afda870553c3c1b3df1d2d4cac5e8da"} Mar 18 14:50:52 crc 
kubenswrapper[4756]: I0318 14:50:52.468767 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="581b28e1eb0425022151f3380e54a0ed8afda870553c3c1b3df1d2d4cac5e8da" Mar 18 14:50:52 crc kubenswrapper[4756]: I0318 14:50:52.529367 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:52 crc kubenswrapper[4756]: I0318 14:50:52.573626 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-catalog-content\") pod \"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d\" (UID: \"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d\") " Mar 18 14:50:52 crc kubenswrapper[4756]: I0318 14:50:52.573734 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-utilities\") pod \"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d\" (UID: \"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d\") " Mar 18 14:50:52 crc kubenswrapper[4756]: I0318 14:50:52.573832 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdcj8\" (UniqueName: \"kubernetes.io/projected/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-kube-api-access-xdcj8\") pod \"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d\" (UID: \"4d2a5abb-4f34-4c4b-9d20-c0d7f275214d\") " Mar 18 14:50:52 crc kubenswrapper[4756]: I0318 14:50:52.574487 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-utilities" (OuterVolumeSpecName: "utilities") pod "4d2a5abb-4f34-4c4b-9d20-c0d7f275214d" (UID: "4d2a5abb-4f34-4c4b-9d20-c0d7f275214d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:50:52 crc kubenswrapper[4756]: I0318 14:50:52.581307 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-kube-api-access-xdcj8" (OuterVolumeSpecName: "kube-api-access-xdcj8") pod "4d2a5abb-4f34-4c4b-9d20-c0d7f275214d" (UID: "4d2a5abb-4f34-4c4b-9d20-c0d7f275214d"). InnerVolumeSpecName "kube-api-access-xdcj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:50:52 crc kubenswrapper[4756]: I0318 14:50:52.640506 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d2a5abb-4f34-4c4b-9d20-c0d7f275214d" (UID: "4d2a5abb-4f34-4c4b-9d20-c0d7f275214d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:50:52 crc kubenswrapper[4756]: I0318 14:50:52.676179 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdcj8\" (UniqueName: \"kubernetes.io/projected/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-kube-api-access-xdcj8\") on node \"crc\" DevicePath \"\"" Mar 18 14:50:52 crc kubenswrapper[4756]: I0318 14:50:52.676215 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:50:52 crc kubenswrapper[4756]: I0318 14:50:52.676225 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:50:53 crc kubenswrapper[4756]: I0318 14:50:53.476532 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-whmx9" Mar 18 14:50:53 crc kubenswrapper[4756]: I0318 14:50:53.502951 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-whmx9"] Mar 18 14:50:53 crc kubenswrapper[4756]: I0318 14:50:53.514956 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-whmx9"] Mar 18 14:50:55 crc kubenswrapper[4756]: I0318 14:50:55.339637 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d2a5abb-4f34-4c4b-9d20-c0d7f275214d" path="/var/lib/kubelet/pods/4d2a5abb-4f34-4c4b-9d20-c0d7f275214d/volumes" Mar 18 14:52:00 crc kubenswrapper[4756]: I0318 14:52:00.159347 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564092-vjjw9"] Mar 18 14:52:00 crc kubenswrapper[4756]: E0318 14:52:00.160466 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d2a5abb-4f34-4c4b-9d20-c0d7f275214d" containerName="extract-utilities" Mar 18 14:52:00 crc kubenswrapper[4756]: I0318 14:52:00.160483 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d2a5abb-4f34-4c4b-9d20-c0d7f275214d" containerName="extract-utilities" Mar 18 14:52:00 crc kubenswrapper[4756]: E0318 14:52:00.160502 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d2a5abb-4f34-4c4b-9d20-c0d7f275214d" containerName="registry-server" Mar 18 14:52:00 crc kubenswrapper[4756]: I0318 14:52:00.160510 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d2a5abb-4f34-4c4b-9d20-c0d7f275214d" containerName="registry-server" Mar 18 14:52:00 crc kubenswrapper[4756]: E0318 14:52:00.160524 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d2a5abb-4f34-4c4b-9d20-c0d7f275214d" containerName="extract-content" Mar 18 14:52:00 crc kubenswrapper[4756]: I0318 14:52:00.160532 4756 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4d2a5abb-4f34-4c4b-9d20-c0d7f275214d" containerName="extract-content" Mar 18 14:52:00 crc kubenswrapper[4756]: I0318 14:52:00.160766 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d2a5abb-4f34-4c4b-9d20-c0d7f275214d" containerName="registry-server" Mar 18 14:52:00 crc kubenswrapper[4756]: I0318 14:52:00.161806 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564092-vjjw9" Mar 18 14:52:00 crc kubenswrapper[4756]: I0318 14:52:00.165459 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:52:00 crc kubenswrapper[4756]: I0318 14:52:00.166032 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:52:00 crc kubenswrapper[4756]: I0318 14:52:00.166250 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:52:00 crc kubenswrapper[4756]: I0318 14:52:00.194448 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564092-vjjw9"] Mar 18 14:52:00 crc kubenswrapper[4756]: I0318 14:52:00.244576 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46gm6\" (UniqueName: \"kubernetes.io/projected/9b3619f6-418c-4ffc-81c3-b77245662820-kube-api-access-46gm6\") pod \"auto-csr-approver-29564092-vjjw9\" (UID: \"9b3619f6-418c-4ffc-81c3-b77245662820\") " pod="openshift-infra/auto-csr-approver-29564092-vjjw9" Mar 18 14:52:00 crc kubenswrapper[4756]: I0318 14:52:00.346245 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46gm6\" (UniqueName: \"kubernetes.io/projected/9b3619f6-418c-4ffc-81c3-b77245662820-kube-api-access-46gm6\") pod \"auto-csr-approver-29564092-vjjw9\" (UID: \"9b3619f6-418c-4ffc-81c3-b77245662820\") " 
pod="openshift-infra/auto-csr-approver-29564092-vjjw9" Mar 18 14:52:00 crc kubenswrapper[4756]: I0318 14:52:00.377150 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46gm6\" (UniqueName: \"kubernetes.io/projected/9b3619f6-418c-4ffc-81c3-b77245662820-kube-api-access-46gm6\") pod \"auto-csr-approver-29564092-vjjw9\" (UID: \"9b3619f6-418c-4ffc-81c3-b77245662820\") " pod="openshift-infra/auto-csr-approver-29564092-vjjw9" Mar 18 14:52:00 crc kubenswrapper[4756]: I0318 14:52:00.489197 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564092-vjjw9" Mar 18 14:52:01 crc kubenswrapper[4756]: W0318 14:52:01.295530 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b3619f6_418c_4ffc_81c3_b77245662820.slice/crio-3b8a7297727a1a4406fecbecd1138c1b73bcb3a96833fa547eb810d9e7f485ec WatchSource:0}: Error finding container 3b8a7297727a1a4406fecbecd1138c1b73bcb3a96833fa547eb810d9e7f485ec: Status 404 returned error can't find the container with id 3b8a7297727a1a4406fecbecd1138c1b73bcb3a96833fa547eb810d9e7f485ec Mar 18 14:52:01 crc kubenswrapper[4756]: I0318 14:52:01.344136 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564092-vjjw9"] Mar 18 14:52:02 crc kubenswrapper[4756]: I0318 14:52:02.183443 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564092-vjjw9" event={"ID":"9b3619f6-418c-4ffc-81c3-b77245662820","Type":"ContainerStarted","Data":"3b8a7297727a1a4406fecbecd1138c1b73bcb3a96833fa547eb810d9e7f485ec"} Mar 18 14:52:03 crc kubenswrapper[4756]: I0318 14:52:03.194061 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564092-vjjw9" 
event={"ID":"9b3619f6-418c-4ffc-81c3-b77245662820","Type":"ContainerStarted","Data":"b78738cf7e954838da5435271146ff418efdd1a10ec15cdf2264881f6c5f4948"} Mar 18 14:52:03 crc kubenswrapper[4756]: I0318 14:52:03.215561 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564092-vjjw9" podStartSLOduration=2.033941402 podStartE2EDuration="3.215544928s" podCreationTimestamp="2026-03-18 14:52:00 +0000 UTC" firstStartedPulling="2026-03-18 14:52:01.300278244 +0000 UTC m=+3122.614696219" lastFinishedPulling="2026-03-18 14:52:02.48188177 +0000 UTC m=+3123.796299745" observedRunningTime="2026-03-18 14:52:03.206624926 +0000 UTC m=+3124.521042901" watchObservedRunningTime="2026-03-18 14:52:03.215544928 +0000 UTC m=+3124.529962903" Mar 18 14:52:04 crc kubenswrapper[4756]: I0318 14:52:04.205528 4756 generic.go:334] "Generic (PLEG): container finished" podID="9b3619f6-418c-4ffc-81c3-b77245662820" containerID="b78738cf7e954838da5435271146ff418efdd1a10ec15cdf2264881f6c5f4948" exitCode=0 Mar 18 14:52:04 crc kubenswrapper[4756]: I0318 14:52:04.205620 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564092-vjjw9" event={"ID":"9b3619f6-418c-4ffc-81c3-b77245662820","Type":"ContainerDied","Data":"b78738cf7e954838da5435271146ff418efdd1a10ec15cdf2264881f6c5f4948"} Mar 18 14:52:06 crc kubenswrapper[4756]: I0318 14:52:06.143813 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564092-vjjw9" Mar 18 14:52:06 crc kubenswrapper[4756]: I0318 14:52:06.247558 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564092-vjjw9" event={"ID":"9b3619f6-418c-4ffc-81c3-b77245662820","Type":"ContainerDied","Data":"3b8a7297727a1a4406fecbecd1138c1b73bcb3a96833fa547eb810d9e7f485ec"} Mar 18 14:52:06 crc kubenswrapper[4756]: I0318 14:52:06.247607 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b8a7297727a1a4406fecbecd1138c1b73bcb3a96833fa547eb810d9e7f485ec" Mar 18 14:52:06 crc kubenswrapper[4756]: I0318 14:52:06.247877 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564092-vjjw9" Mar 18 14:52:06 crc kubenswrapper[4756]: I0318 14:52:06.274411 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46gm6\" (UniqueName: \"kubernetes.io/projected/9b3619f6-418c-4ffc-81c3-b77245662820-kube-api-access-46gm6\") pod \"9b3619f6-418c-4ffc-81c3-b77245662820\" (UID: \"9b3619f6-418c-4ffc-81c3-b77245662820\") " Mar 18 14:52:06 crc kubenswrapper[4756]: I0318 14:52:06.283318 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b3619f6-418c-4ffc-81c3-b77245662820-kube-api-access-46gm6" (OuterVolumeSpecName: "kube-api-access-46gm6") pod "9b3619f6-418c-4ffc-81c3-b77245662820" (UID: "9b3619f6-418c-4ffc-81c3-b77245662820"). InnerVolumeSpecName "kube-api-access-46gm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:52:06 crc kubenswrapper[4756]: I0318 14:52:06.331184 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564086-mkgvh"] Mar 18 14:52:06 crc kubenswrapper[4756]: I0318 14:52:06.347646 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564086-mkgvh"] Mar 18 14:52:06 crc kubenswrapper[4756]: I0318 14:52:06.385617 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46gm6\" (UniqueName: \"kubernetes.io/projected/9b3619f6-418c-4ffc-81c3-b77245662820-kube-api-access-46gm6\") on node \"crc\" DevicePath \"\"" Mar 18 14:52:07 crc kubenswrapper[4756]: I0318 14:52:07.326111 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cab5bbb-cbae-4857-b0c3-a19146b51fe0" path="/var/lib/kubelet/pods/4cab5bbb-cbae-4857-b0c3-a19146b51fe0/volumes" Mar 18 14:52:34 crc kubenswrapper[4756]: I0318 14:52:34.191679 4756 scope.go:117] "RemoveContainer" containerID="5f0a482f9a661a177b997a179426d47b169578ccc1f70a062f41ce9c081ee3c9" Mar 18 14:52:34 crc kubenswrapper[4756]: I0318 14:52:34.232859 4756 scope.go:117] "RemoveContainer" containerID="d0d51f8b8442556db706d6bc64b53160a9332d57f197db9b64cae98652dbbb86" Mar 18 14:52:34 crc kubenswrapper[4756]: I0318 14:52:34.286133 4756 scope.go:117] "RemoveContainer" containerID="c6f80fc83b82efb2922aff8339f15c8f2ffc4c069440960cbc42bb19e7270fc7" Mar 18 14:52:34 crc kubenswrapper[4756]: I0318 14:52:34.363610 4756 scope.go:117] "RemoveContainer" containerID="ea3dbf9ea57d1bf9191f6e4469df781274267857c88cfb5adec4c5957cc602ab" Mar 18 14:52:36 crc kubenswrapper[4756]: I0318 14:52:36.915441 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 18 14:52:36 crc kubenswrapper[4756]: I0318 14:52:36.915815 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:53:06 crc kubenswrapper[4756]: I0318 14:53:06.915666 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:53:06 crc kubenswrapper[4756]: I0318 14:53:06.916299 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:53:36 crc kubenswrapper[4756]: I0318 14:53:36.914854 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:53:36 crc kubenswrapper[4756]: I0318 14:53:36.915482 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:53:36 crc kubenswrapper[4756]: I0318 14:53:36.915690 4756 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 14:53:36 crc kubenswrapper[4756]: I0318 14:53:36.916430 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b"} pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:53:36 crc kubenswrapper[4756]: I0318 14:53:36.916482 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" containerID="cri-o://31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" gracePeriod=600 Mar 18 14:53:37 crc kubenswrapper[4756]: E0318 14:53:37.044242 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:53:37 crc kubenswrapper[4756]: I0318 14:53:37.133608 4756 generic.go:334] "Generic (PLEG): container finished" podID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" exitCode=0 Mar 18 14:53:37 crc kubenswrapper[4756]: I0318 14:53:37.133865 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" 
event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerDied","Data":"31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b"} Mar 18 14:53:37 crc kubenswrapper[4756]: I0318 14:53:37.133994 4756 scope.go:117] "RemoveContainer" containerID="c541bb14ab625c993df67e9d02820948c3b385ef7f0e95f5dad40eb319141136" Mar 18 14:53:37 crc kubenswrapper[4756]: I0318 14:53:37.135000 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:53:37 crc kubenswrapper[4756]: E0318 14:53:37.135472 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:53:51 crc kubenswrapper[4756]: I0318 14:53:51.307029 4756 generic.go:334] "Generic (PLEG): container finished" podID="5e78c973-235a-4dcb-927f-7a5d35e786cc" containerID="f7940a5bac3695fdcd7611031f9cc8cfe7f18b40ca9ead9e69a01907b9dffd5e" exitCode=0 Mar 18 14:53:51 crc kubenswrapper[4756]: I0318 14:53:51.307136 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5e78c973-235a-4dcb-927f-7a5d35e786cc","Type":"ContainerDied","Data":"f7940a5bac3695fdcd7611031f9cc8cfe7f18b40ca9ead9e69a01907b9dffd5e"} Mar 18 14:53:51 crc kubenswrapper[4756]: I0318 14:53:51.317254 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:53:51 crc kubenswrapper[4756]: E0318 14:53:51.317524 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.335204 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5e78c973-235a-4dcb-927f-7a5d35e786cc","Type":"ContainerDied","Data":"1116aaa9af7774610bd4a59753895081e68c1369906172eed2323404ff9d6276"} Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.335576 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1116aaa9af7774610bd4a59753895081e68c1369906172eed2323404ff9d6276" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.375543 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.499130 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-ssh-key\") pod \"5e78c973-235a-4dcb-927f-7a5d35e786cc\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.499214 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-openstack-config-secret\") pod \"5e78c973-235a-4dcb-927f-7a5d35e786cc\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.499235 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5e78c973-235a-4dcb-927f-7a5d35e786cc-test-operator-ephemeral-workdir\") pod 
\"5e78c973-235a-4dcb-927f-7a5d35e786cc\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.499272 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e78c973-235a-4dcb-927f-7a5d35e786cc-config-data\") pod \"5e78c973-235a-4dcb-927f-7a5d35e786cc\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.499326 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"5e78c973-235a-4dcb-927f-7a5d35e786cc\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.499472 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtttm\" (UniqueName: \"kubernetes.io/projected/5e78c973-235a-4dcb-927f-7a5d35e786cc-kube-api-access-jtttm\") pod \"5e78c973-235a-4dcb-927f-7a5d35e786cc\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.499496 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e78c973-235a-4dcb-927f-7a5d35e786cc-openstack-config\") pod \"5e78c973-235a-4dcb-927f-7a5d35e786cc\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.499517 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5e78c973-235a-4dcb-927f-7a5d35e786cc-test-operator-ephemeral-temporary\") pod \"5e78c973-235a-4dcb-927f-7a5d35e786cc\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.499554 4756 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-ca-certs\") pod \"5e78c973-235a-4dcb-927f-7a5d35e786cc\" (UID: \"5e78c973-235a-4dcb-927f-7a5d35e786cc\") " Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.500376 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e78c973-235a-4dcb-927f-7a5d35e786cc-config-data" (OuterVolumeSpecName: "config-data") pod "5e78c973-235a-4dcb-927f-7a5d35e786cc" (UID: "5e78c973-235a-4dcb-927f-7a5d35e786cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.500376 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e78c973-235a-4dcb-927f-7a5d35e786cc-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "5e78c973-235a-4dcb-927f-7a5d35e786cc" (UID: "5e78c973-235a-4dcb-927f-7a5d35e786cc"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.508267 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e78c973-235a-4dcb-927f-7a5d35e786cc-kube-api-access-jtttm" (OuterVolumeSpecName: "kube-api-access-jtttm") pod "5e78c973-235a-4dcb-927f-7a5d35e786cc" (UID: "5e78c973-235a-4dcb-927f-7a5d35e786cc"). InnerVolumeSpecName "kube-api-access-jtttm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.515332 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "5e78c973-235a-4dcb-927f-7a5d35e786cc" (UID: "5e78c973-235a-4dcb-927f-7a5d35e786cc"). 
InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.567093 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "5e78c973-235a-4dcb-927f-7a5d35e786cc" (UID: "5e78c973-235a-4dcb-927f-7a5d35e786cc"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.572064 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5e78c973-235a-4dcb-927f-7a5d35e786cc" (UID: "5e78c973-235a-4dcb-927f-7a5d35e786cc"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.578551 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5e78c973-235a-4dcb-927f-7a5d35e786cc" (UID: "5e78c973-235a-4dcb-927f-7a5d35e786cc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.604842 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.604873 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e78c973-235a-4dcb-927f-7a5d35e786cc-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.604909 4756 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.604919 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtttm\" (UniqueName: \"kubernetes.io/projected/5e78c973-235a-4dcb-927f-7a5d35e786cc-kube-api-access-jtttm\") on node \"crc\" DevicePath \"\"" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.604929 4756 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5e78c973-235a-4dcb-927f-7a5d35e786cc-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.604941 4756 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.604950 4756 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e78c973-235a-4dcb-927f-7a5d35e786cc-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 
14:53:53.605613 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e78c973-235a-4dcb-927f-7a5d35e786cc-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5e78c973-235a-4dcb-927f-7a5d35e786cc" (UID: "5e78c973-235a-4dcb-927f-7a5d35e786cc"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.625936 4756 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.707358 4756 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.707386 4756 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e78c973-235a-4dcb-927f-7a5d35e786cc-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 14:53:53 crc kubenswrapper[4756]: I0318 14:53:53.936017 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e78c973-235a-4dcb-927f-7a5d35e786cc-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "5e78c973-235a-4dcb-927f-7a5d35e786cc" (UID: "5e78c973-235a-4dcb-927f-7a5d35e786cc"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:53:54 crc kubenswrapper[4756]: I0318 14:53:54.012502 4756 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5e78c973-235a-4dcb-927f-7a5d35e786cc-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 18 14:53:54 crc kubenswrapper[4756]: I0318 14:53:54.344792 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 14:53:57.347197 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 18 14:53:57 crc kubenswrapper[4756]: E0318 14:53:57.347901 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e78c973-235a-4dcb-927f-7a5d35e786cc" containerName="tempest-tests-tempest-tests-runner" Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 14:53:57.347914 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e78c973-235a-4dcb-927f-7a5d35e786cc" containerName="tempest-tests-tempest-tests-runner" Mar 18 14:53:57 crc kubenswrapper[4756]: E0318 14:53:57.347952 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3619f6-418c-4ffc-81c3-b77245662820" containerName="oc" Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 14:53:57.347961 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3619f6-418c-4ffc-81c3-b77245662820" containerName="oc" Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 14:53:57.348227 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e78c973-235a-4dcb-927f-7a5d35e786cc" containerName="tempest-tests-tempest-tests-runner" Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 14:53:57.348255 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b3619f6-418c-4ffc-81c3-b77245662820" containerName="oc" Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 
14:53:57.349287 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 14:53:57.352872 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 14:53:57.353532 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-b452p" Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 14:53:57.478208 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qmd5\" (UniqueName: \"kubernetes.io/projected/47cc7818-59ef-442f-9fce-df34f7275bc4-kube-api-access-9qmd5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"47cc7818-59ef-442f-9fce-df34f7275bc4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 14:53:57.478742 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"47cc7818-59ef-442f-9fce-df34f7275bc4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 14:53:57.580771 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"47cc7818-59ef-442f-9fce-df34f7275bc4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 14:53:57.580844 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9qmd5\" (UniqueName: \"kubernetes.io/projected/47cc7818-59ef-442f-9fce-df34f7275bc4-kube-api-access-9qmd5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"47cc7818-59ef-442f-9fce-df34f7275bc4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 14:53:57.581433 4756 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"47cc7818-59ef-442f-9fce-df34f7275bc4\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 14:53:57.600437 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qmd5\" (UniqueName: \"kubernetes.io/projected/47cc7818-59ef-442f-9fce-df34f7275bc4-kube-api-access-9qmd5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"47cc7818-59ef-442f-9fce-df34f7275bc4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 14:53:57.629458 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"47cc7818-59ef-442f-9fce-df34f7275bc4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:53:57 crc kubenswrapper[4756]: I0318 14:53:57.683424 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 14:53:58 crc kubenswrapper[4756]: I0318 14:53:58.371435 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 18 14:53:58 crc kubenswrapper[4756]: I0318 14:53:58.376243 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:53:58 crc kubenswrapper[4756]: I0318 14:53:58.388950 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"47cc7818-59ef-442f-9fce-df34f7275bc4","Type":"ContainerStarted","Data":"73a7073d7882b21cffa9c040f19817107bb39acdfd7b954bd1648a3c67e32a0d"} Mar 18 14:54:00 crc kubenswrapper[4756]: I0318 14:54:00.209553 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564094-kpsmp"] Mar 18 14:54:00 crc kubenswrapper[4756]: I0318 14:54:00.221848 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564094-kpsmp" Mar 18 14:54:00 crc kubenswrapper[4756]: I0318 14:54:00.229744 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:54:00 crc kubenswrapper[4756]: I0318 14:54:00.230001 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:54:00 crc kubenswrapper[4756]: I0318 14:54:00.230221 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:54:00 crc kubenswrapper[4756]: I0318 14:54:00.253576 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564094-kpsmp"] Mar 18 14:54:00 crc kubenswrapper[4756]: I0318 14:54:00.339367 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr4fk\" (UniqueName: \"kubernetes.io/projected/6f63def4-e2ed-400f-952f-ac41a3d00414-kube-api-access-gr4fk\") pod \"auto-csr-approver-29564094-kpsmp\" (UID: \"6f63def4-e2ed-400f-952f-ac41a3d00414\") " pod="openshift-infra/auto-csr-approver-29564094-kpsmp" Mar 18 14:54:00 crc kubenswrapper[4756]: I0318 14:54:00.448850 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr4fk\" (UniqueName: \"kubernetes.io/projected/6f63def4-e2ed-400f-952f-ac41a3d00414-kube-api-access-gr4fk\") pod \"auto-csr-approver-29564094-kpsmp\" (UID: \"6f63def4-e2ed-400f-952f-ac41a3d00414\") " pod="openshift-infra/auto-csr-approver-29564094-kpsmp" Mar 18 14:54:00 crc kubenswrapper[4756]: I0318 14:54:00.480856 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr4fk\" (UniqueName: \"kubernetes.io/projected/6f63def4-e2ed-400f-952f-ac41a3d00414-kube-api-access-gr4fk\") pod \"auto-csr-approver-29564094-kpsmp\" (UID: \"6f63def4-e2ed-400f-952f-ac41a3d00414\") " 
pod="openshift-infra/auto-csr-approver-29564094-kpsmp" Mar 18 14:54:00 crc kubenswrapper[4756]: I0318 14:54:00.551431 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564094-kpsmp" Mar 18 14:54:01 crc kubenswrapper[4756]: I0318 14:54:01.304129 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564094-kpsmp"] Mar 18 14:54:01 crc kubenswrapper[4756]: W0318 14:54:01.316303 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f63def4_e2ed_400f_952f_ac41a3d00414.slice/crio-777214173d6e856ddfad2bae042427801cf6f2a8ed9c0269fd2a563f5ff75cc9 WatchSource:0}: Error finding container 777214173d6e856ddfad2bae042427801cf6f2a8ed9c0269fd2a563f5ff75cc9: Status 404 returned error can't find the container with id 777214173d6e856ddfad2bae042427801cf6f2a8ed9c0269fd2a563f5ff75cc9 Mar 18 14:54:01 crc kubenswrapper[4756]: I0318 14:54:01.460248 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564094-kpsmp" event={"ID":"6f63def4-e2ed-400f-952f-ac41a3d00414","Type":"ContainerStarted","Data":"777214173d6e856ddfad2bae042427801cf6f2a8ed9c0269fd2a563f5ff75cc9"} Mar 18 14:54:01 crc kubenswrapper[4756]: I0318 14:54:01.462054 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"47cc7818-59ef-442f-9fce-df34f7275bc4","Type":"ContainerStarted","Data":"055b3b2b138a38082758d96515a0462d4a12573cfa28f3cfe92b71fab6de16e6"} Mar 18 14:54:01 crc kubenswrapper[4756]: I0318 14:54:01.486800 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.253796173 podStartE2EDuration="4.486778802s" podCreationTimestamp="2026-03-18 14:53:57 +0000 UTC" firstStartedPulling="2026-03-18 
14:53:58.375998027 +0000 UTC m=+3239.690416002" lastFinishedPulling="2026-03-18 14:54:00.608980656 +0000 UTC m=+3241.923398631" observedRunningTime="2026-03-18 14:54:01.474945812 +0000 UTC m=+3242.789363787" watchObservedRunningTime="2026-03-18 14:54:01.486778802 +0000 UTC m=+3242.801196767" Mar 18 14:54:02 crc kubenswrapper[4756]: I0318 14:54:02.317643 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:54:02 crc kubenswrapper[4756]: E0318 14:54:02.318104 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:54:03 crc kubenswrapper[4756]: I0318 14:54:03.484328 4756 generic.go:334] "Generic (PLEG): container finished" podID="6f63def4-e2ed-400f-952f-ac41a3d00414" containerID="512b6fd45de2b3ccc2b133724475a3166dc5b70175af817a19685cf8aa9525ef" exitCode=0 Mar 18 14:54:03 crc kubenswrapper[4756]: I0318 14:54:03.484434 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564094-kpsmp" event={"ID":"6f63def4-e2ed-400f-952f-ac41a3d00414","Type":"ContainerDied","Data":"512b6fd45de2b3ccc2b133724475a3166dc5b70175af817a19685cf8aa9525ef"} Mar 18 14:54:05 crc kubenswrapper[4756]: I0318 14:54:05.502797 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564094-kpsmp" event={"ID":"6f63def4-e2ed-400f-952f-ac41a3d00414","Type":"ContainerDied","Data":"777214173d6e856ddfad2bae042427801cf6f2a8ed9c0269fd2a563f5ff75cc9"} Mar 18 14:54:05 crc kubenswrapper[4756]: I0318 14:54:05.503047 4756 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="777214173d6e856ddfad2bae042427801cf6f2a8ed9c0269fd2a563f5ff75cc9" Mar 18 14:54:05 crc kubenswrapper[4756]: I0318 14:54:05.520719 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564094-kpsmp" Mar 18 14:54:05 crc kubenswrapper[4756]: I0318 14:54:05.703660 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr4fk\" (UniqueName: \"kubernetes.io/projected/6f63def4-e2ed-400f-952f-ac41a3d00414-kube-api-access-gr4fk\") pod \"6f63def4-e2ed-400f-952f-ac41a3d00414\" (UID: \"6f63def4-e2ed-400f-952f-ac41a3d00414\") " Mar 18 14:54:05 crc kubenswrapper[4756]: I0318 14:54:05.723092 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f63def4-e2ed-400f-952f-ac41a3d00414-kube-api-access-gr4fk" (OuterVolumeSpecName: "kube-api-access-gr4fk") pod "6f63def4-e2ed-400f-952f-ac41a3d00414" (UID: "6f63def4-e2ed-400f-952f-ac41a3d00414"). InnerVolumeSpecName "kube-api-access-gr4fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:54:05 crc kubenswrapper[4756]: I0318 14:54:05.807489 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr4fk\" (UniqueName: \"kubernetes.io/projected/6f63def4-e2ed-400f-952f-ac41a3d00414-kube-api-access-gr4fk\") on node \"crc\" DevicePath \"\"" Mar 18 14:54:06 crc kubenswrapper[4756]: I0318 14:54:06.511497 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564094-kpsmp" Mar 18 14:54:06 crc kubenswrapper[4756]: I0318 14:54:06.597863 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564088-2dmgh"] Mar 18 14:54:06 crc kubenswrapper[4756]: I0318 14:54:06.606493 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564088-2dmgh"] Mar 18 14:54:07 crc kubenswrapper[4756]: I0318 14:54:07.325685 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e80e2f-abcf-4daa-84dd-bc3d67b763d9" path="/var/lib/kubelet/pods/a5e80e2f-abcf-4daa-84dd-bc3d67b763d9/volumes" Mar 18 14:54:14 crc kubenswrapper[4756]: I0318 14:54:14.315783 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:54:14 crc kubenswrapper[4756]: E0318 14:54:14.316317 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:54:25 crc kubenswrapper[4756]: I0318 14:54:25.315727 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:54:25 crc kubenswrapper[4756]: E0318 14:54:25.316497 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" 
podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:54:34 crc kubenswrapper[4756]: I0318 14:54:34.458689 4756 scope.go:117] "RemoveContainer" containerID="9446f84dc26e74a6c1c1b795ce02f223cbed2c5988315b5a12edd0d0481da9c3" Mar 18 14:54:39 crc kubenswrapper[4756]: I0318 14:54:39.338686 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:54:39 crc kubenswrapper[4756]: E0318 14:54:39.339519 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:54:44 crc kubenswrapper[4756]: I0318 14:54:44.453769 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m8n6h/must-gather-7s6gq"] Mar 18 14:54:44 crc kubenswrapper[4756]: E0318 14:54:44.454623 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f63def4-e2ed-400f-952f-ac41a3d00414" containerName="oc" Mar 18 14:54:44 crc kubenswrapper[4756]: I0318 14:54:44.454635 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f63def4-e2ed-400f-952f-ac41a3d00414" containerName="oc" Mar 18 14:54:44 crc kubenswrapper[4756]: I0318 14:54:44.454827 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f63def4-e2ed-400f-952f-ac41a3d00414" containerName="oc" Mar 18 14:54:44 crc kubenswrapper[4756]: I0318 14:54:44.455895 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m8n6h/must-gather-7s6gq" Mar 18 14:54:44 crc kubenswrapper[4756]: I0318 14:54:44.457976 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-m8n6h"/"kube-root-ca.crt" Mar 18 14:54:44 crc kubenswrapper[4756]: I0318 14:54:44.458204 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-m8n6h"/"openshift-service-ca.crt" Mar 18 14:54:44 crc kubenswrapper[4756]: I0318 14:54:44.469537 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m8n6h/must-gather-7s6gq"] Mar 18 14:54:44 crc kubenswrapper[4756]: I0318 14:54:44.548363 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5zv7\" (UniqueName: \"kubernetes.io/projected/b8387997-ea84-4ab9-aecd-315e77e225e0-kube-api-access-h5zv7\") pod \"must-gather-7s6gq\" (UID: \"b8387997-ea84-4ab9-aecd-315e77e225e0\") " pod="openshift-must-gather-m8n6h/must-gather-7s6gq" Mar 18 14:54:44 crc kubenswrapper[4756]: I0318 14:54:44.548824 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8387997-ea84-4ab9-aecd-315e77e225e0-must-gather-output\") pod \"must-gather-7s6gq\" (UID: \"b8387997-ea84-4ab9-aecd-315e77e225e0\") " pod="openshift-must-gather-m8n6h/must-gather-7s6gq" Mar 18 14:54:44 crc kubenswrapper[4756]: I0318 14:54:44.651488 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8387997-ea84-4ab9-aecd-315e77e225e0-must-gather-output\") pod \"must-gather-7s6gq\" (UID: \"b8387997-ea84-4ab9-aecd-315e77e225e0\") " pod="openshift-must-gather-m8n6h/must-gather-7s6gq" Mar 18 14:54:44 crc kubenswrapper[4756]: I0318 14:54:44.651592 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h5zv7\" (UniqueName: \"kubernetes.io/projected/b8387997-ea84-4ab9-aecd-315e77e225e0-kube-api-access-h5zv7\") pod \"must-gather-7s6gq\" (UID: \"b8387997-ea84-4ab9-aecd-315e77e225e0\") " pod="openshift-must-gather-m8n6h/must-gather-7s6gq" Mar 18 14:54:44 crc kubenswrapper[4756]: I0318 14:54:44.651952 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8387997-ea84-4ab9-aecd-315e77e225e0-must-gather-output\") pod \"must-gather-7s6gq\" (UID: \"b8387997-ea84-4ab9-aecd-315e77e225e0\") " pod="openshift-must-gather-m8n6h/must-gather-7s6gq" Mar 18 14:54:44 crc kubenswrapper[4756]: I0318 14:54:44.676770 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5zv7\" (UniqueName: \"kubernetes.io/projected/b8387997-ea84-4ab9-aecd-315e77e225e0-kube-api-access-h5zv7\") pod \"must-gather-7s6gq\" (UID: \"b8387997-ea84-4ab9-aecd-315e77e225e0\") " pod="openshift-must-gather-m8n6h/must-gather-7s6gq" Mar 18 14:54:44 crc kubenswrapper[4756]: I0318 14:54:44.776826 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m8n6h/must-gather-7s6gq" Mar 18 14:54:45 crc kubenswrapper[4756]: I0318 14:54:45.615247 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m8n6h/must-gather-7s6gq"] Mar 18 14:54:45 crc kubenswrapper[4756]: I0318 14:54:45.861719 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m8n6h/must-gather-7s6gq" event={"ID":"b8387997-ea84-4ab9-aecd-315e77e225e0","Type":"ContainerStarted","Data":"dbd485bef9f40654ff4658f03d54b7403daa20486416497dc52cef0237017f7f"} Mar 18 14:54:52 crc kubenswrapper[4756]: I0318 14:54:52.318052 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:54:52 crc kubenswrapper[4756]: E0318 14:54:52.318808 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:55:02 crc kubenswrapper[4756]: I0318 14:55:02.040256 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m8n6h/must-gather-7s6gq" event={"ID":"b8387997-ea84-4ab9-aecd-315e77e225e0","Type":"ContainerStarted","Data":"bb3f759479d9429141ba68eda66e087e47db88d50f2abaaea32bda4171df6146"} Mar 18 14:55:02 crc kubenswrapper[4756]: I0318 14:55:02.040676 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m8n6h/must-gather-7s6gq" event={"ID":"b8387997-ea84-4ab9-aecd-315e77e225e0","Type":"ContainerStarted","Data":"15934e12b5bda09c6103189b8477de1b9da42aea68338899bdb89c4830bc1b26"} Mar 18 14:55:02 crc kubenswrapper[4756]: I0318 14:55:02.059127 4756 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-m8n6h/must-gather-7s6gq" podStartSLOduration=2.681630518 podStartE2EDuration="18.059096403s" podCreationTimestamp="2026-03-18 14:54:44 +0000 UTC" firstStartedPulling="2026-03-18 14:54:45.629256627 +0000 UTC m=+3286.943674602" lastFinishedPulling="2026-03-18 14:55:01.006722512 +0000 UTC m=+3302.321140487" observedRunningTime="2026-03-18 14:55:02.056098042 +0000 UTC m=+3303.370516037" watchObservedRunningTime="2026-03-18 14:55:02.059096403 +0000 UTC m=+3303.373514378" Mar 18 14:55:07 crc kubenswrapper[4756]: I0318 14:55:07.322053 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:55:07 crc kubenswrapper[4756]: E0318 14:55:07.322843 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:55:08 crc kubenswrapper[4756]: I0318 14:55:08.831184 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m8n6h/crc-debug-s5c5w"] Mar 18 14:55:08 crc kubenswrapper[4756]: I0318 14:55:08.832481 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" Mar 18 14:55:08 crc kubenswrapper[4756]: I0318 14:55:08.834774 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-m8n6h"/"default-dockercfg-l8xhl" Mar 18 14:55:08 crc kubenswrapper[4756]: I0318 14:55:08.887928 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5v7m\" (UniqueName: \"kubernetes.io/projected/2539c4f7-fc16-442b-8f7b-54e4fa71470c-kube-api-access-j5v7m\") pod \"crc-debug-s5c5w\" (UID: \"2539c4f7-fc16-442b-8f7b-54e4fa71470c\") " pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" Mar 18 14:55:08 crc kubenswrapper[4756]: I0318 14:55:08.888308 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2539c4f7-fc16-442b-8f7b-54e4fa71470c-host\") pod \"crc-debug-s5c5w\" (UID: \"2539c4f7-fc16-442b-8f7b-54e4fa71470c\") " pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" Mar 18 14:55:08 crc kubenswrapper[4756]: I0318 14:55:08.989885 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5v7m\" (UniqueName: \"kubernetes.io/projected/2539c4f7-fc16-442b-8f7b-54e4fa71470c-kube-api-access-j5v7m\") pod \"crc-debug-s5c5w\" (UID: \"2539c4f7-fc16-442b-8f7b-54e4fa71470c\") " pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" Mar 18 14:55:08 crc kubenswrapper[4756]: I0318 14:55:08.990003 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2539c4f7-fc16-442b-8f7b-54e4fa71470c-host\") pod \"crc-debug-s5c5w\" (UID: \"2539c4f7-fc16-442b-8f7b-54e4fa71470c\") " pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" Mar 18 14:55:08 crc kubenswrapper[4756]: I0318 14:55:08.990187 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/2539c4f7-fc16-442b-8f7b-54e4fa71470c-host\") pod \"crc-debug-s5c5w\" (UID: \"2539c4f7-fc16-442b-8f7b-54e4fa71470c\") " pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" Mar 18 14:55:09 crc kubenswrapper[4756]: I0318 14:55:09.007425 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5v7m\" (UniqueName: \"kubernetes.io/projected/2539c4f7-fc16-442b-8f7b-54e4fa71470c-kube-api-access-j5v7m\") pod \"crc-debug-s5c5w\" (UID: \"2539c4f7-fc16-442b-8f7b-54e4fa71470c\") " pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" Mar 18 14:55:09 crc kubenswrapper[4756]: I0318 14:55:09.150454 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" Mar 18 14:55:10 crc kubenswrapper[4756]: I0318 14:55:10.124027 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" event={"ID":"2539c4f7-fc16-442b-8f7b-54e4fa71470c","Type":"ContainerStarted","Data":"3272a2907fa690454e32917fd3a4a99d4f6c20c194291f2e30995e3bf2e11011"} Mar 18 14:55:21 crc kubenswrapper[4756]: I0318 14:55:21.315683 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:55:21 crc kubenswrapper[4756]: E0318 14:55:21.316478 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:55:29 crc kubenswrapper[4756]: E0318 14:55:29.880708 4756 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Mar 18 14:55:29 crc kubenswrapper[4756]: E0318 14:55:29.882144 4756 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; 
fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j5v7m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-s5c5w_openshift-must-gather-m8n6h(2539c4f7-fc16-442b-8f7b-54e4fa71470c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 14:55:29 crc kubenswrapper[4756]: E0318 14:55:29.883325 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" podUID="2539c4f7-fc16-442b-8f7b-54e4fa71470c" Mar 18 14:55:30 crc kubenswrapper[4756]: E0318 14:55:30.332187 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" podUID="2539c4f7-fc16-442b-8f7b-54e4fa71470c" Mar 18 14:55:34 crc kubenswrapper[4756]: I0318 14:55:34.316006 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:55:34 crc kubenswrapper[4756]: E0318 14:55:34.316877 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:55:47 crc kubenswrapper[4756]: I0318 14:55:47.473824 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" event={"ID":"2539c4f7-fc16-442b-8f7b-54e4fa71470c","Type":"ContainerStarted","Data":"b307208590d4faab1576bde2d6ec771d0e43e3062bb8529045c6c10ac0782c02"} Mar 18 14:55:47 crc kubenswrapper[4756]: I0318 14:55:47.495293 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" podStartSLOduration=1.6549403919999999 podStartE2EDuration="39.495273055s" podCreationTimestamp="2026-03-18 14:55:08 +0000 UTC" firstStartedPulling="2026-03-18 14:55:09.217965221 +0000 UTC m=+3310.532383196" lastFinishedPulling="2026-03-18 14:55:47.058297884 +0000 UTC m=+3348.372715859" observedRunningTime="2026-03-18 14:55:47.493440915 +0000 UTC m=+3348.807858900" watchObservedRunningTime="2026-03-18 14:55:47.495273055 +0000 UTC m=+3348.809691030" Mar 18 14:55:48 crc kubenswrapper[4756]: I0318 14:55:48.316548 4756 scope.go:117] "RemoveContainer" 
containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:55:48 crc kubenswrapper[4756]: E0318 14:55:48.317024 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:56:00 crc kubenswrapper[4756]: I0318 14:56:00.153768 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564096-fbk77"] Mar 18 14:56:00 crc kubenswrapper[4756]: I0318 14:56:00.155600 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564096-fbk77" Mar 18 14:56:00 crc kubenswrapper[4756]: I0318 14:56:00.158975 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:56:00 crc kubenswrapper[4756]: I0318 14:56:00.159036 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:56:00 crc kubenswrapper[4756]: I0318 14:56:00.159257 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:56:00 crc kubenswrapper[4756]: I0318 14:56:00.163948 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564096-fbk77"] Mar 18 14:56:00 crc kubenswrapper[4756]: I0318 14:56:00.258371 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6xnf\" (UniqueName: \"kubernetes.io/projected/f93b0b30-b12f-4891-9245-9a2878902bbd-kube-api-access-c6xnf\") pod \"auto-csr-approver-29564096-fbk77\" (UID: 
\"f93b0b30-b12f-4891-9245-9a2878902bbd\") " pod="openshift-infra/auto-csr-approver-29564096-fbk77" Mar 18 14:56:00 crc kubenswrapper[4756]: I0318 14:56:00.315940 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:56:00 crc kubenswrapper[4756]: E0318 14:56:00.316201 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:56:00 crc kubenswrapper[4756]: I0318 14:56:00.360704 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6xnf\" (UniqueName: \"kubernetes.io/projected/f93b0b30-b12f-4891-9245-9a2878902bbd-kube-api-access-c6xnf\") pod \"auto-csr-approver-29564096-fbk77\" (UID: \"f93b0b30-b12f-4891-9245-9a2878902bbd\") " pod="openshift-infra/auto-csr-approver-29564096-fbk77" Mar 18 14:56:00 crc kubenswrapper[4756]: I0318 14:56:00.396761 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6xnf\" (UniqueName: \"kubernetes.io/projected/f93b0b30-b12f-4891-9245-9a2878902bbd-kube-api-access-c6xnf\") pod \"auto-csr-approver-29564096-fbk77\" (UID: \"f93b0b30-b12f-4891-9245-9a2878902bbd\") " pod="openshift-infra/auto-csr-approver-29564096-fbk77" Mar 18 14:56:00 crc kubenswrapper[4756]: I0318 14:56:00.494777 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564096-fbk77" Mar 18 14:56:02 crc kubenswrapper[4756]: W0318 14:56:02.555502 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf93b0b30_b12f_4891_9245_9a2878902bbd.slice/crio-751f298b8d5609139638e18b5d609832d9976c80aa8659e47021007f82b26111 WatchSource:0}: Error finding container 751f298b8d5609139638e18b5d609832d9976c80aa8659e47021007f82b26111: Status 404 returned error can't find the container with id 751f298b8d5609139638e18b5d609832d9976c80aa8659e47021007f82b26111 Mar 18 14:56:02 crc kubenswrapper[4756]: I0318 14:56:02.564845 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564096-fbk77"] Mar 18 14:56:02 crc kubenswrapper[4756]: I0318 14:56:02.615692 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564096-fbk77" event={"ID":"f93b0b30-b12f-4891-9245-9a2878902bbd","Type":"ContainerStarted","Data":"751f298b8d5609139638e18b5d609832d9976c80aa8659e47021007f82b26111"} Mar 18 14:56:04 crc kubenswrapper[4756]: I0318 14:56:04.631902 4756 generic.go:334] "Generic (PLEG): container finished" podID="f93b0b30-b12f-4891-9245-9a2878902bbd" containerID="df9bc9c3728533fd8250aa1a3aa67b047c41ec19cfc177ab862a68ed1e57aba7" exitCode=0 Mar 18 14:56:04 crc kubenswrapper[4756]: I0318 14:56:04.632087 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564096-fbk77" event={"ID":"f93b0b30-b12f-4891-9245-9a2878902bbd","Type":"ContainerDied","Data":"df9bc9c3728533fd8250aa1a3aa67b047c41ec19cfc177ab862a68ed1e57aba7"} Mar 18 14:56:06 crc kubenswrapper[4756]: I0318 14:56:06.768968 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564096-fbk77" Mar 18 14:56:06 crc kubenswrapper[4756]: I0318 14:56:06.915668 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6xnf\" (UniqueName: \"kubernetes.io/projected/f93b0b30-b12f-4891-9245-9a2878902bbd-kube-api-access-c6xnf\") pod \"f93b0b30-b12f-4891-9245-9a2878902bbd\" (UID: \"f93b0b30-b12f-4891-9245-9a2878902bbd\") " Mar 18 14:56:06 crc kubenswrapper[4756]: I0318 14:56:06.921969 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93b0b30-b12f-4891-9245-9a2878902bbd-kube-api-access-c6xnf" (OuterVolumeSpecName: "kube-api-access-c6xnf") pod "f93b0b30-b12f-4891-9245-9a2878902bbd" (UID: "f93b0b30-b12f-4891-9245-9a2878902bbd"). InnerVolumeSpecName "kube-api-access-c6xnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:56:07 crc kubenswrapper[4756]: I0318 14:56:07.018260 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6xnf\" (UniqueName: \"kubernetes.io/projected/f93b0b30-b12f-4891-9245-9a2878902bbd-kube-api-access-c6xnf\") on node \"crc\" DevicePath \"\"" Mar 18 14:56:07 crc kubenswrapper[4756]: I0318 14:56:07.664716 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564096-fbk77" event={"ID":"f93b0b30-b12f-4891-9245-9a2878902bbd","Type":"ContainerDied","Data":"751f298b8d5609139638e18b5d609832d9976c80aa8659e47021007f82b26111"} Mar 18 14:56:07 crc kubenswrapper[4756]: I0318 14:56:07.664754 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="751f298b8d5609139638e18b5d609832d9976c80aa8659e47021007f82b26111" Mar 18 14:56:07 crc kubenswrapper[4756]: I0318 14:56:07.664770 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564096-fbk77" Mar 18 14:56:07 crc kubenswrapper[4756]: I0318 14:56:07.837693 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564090-697tj"] Mar 18 14:56:07 crc kubenswrapper[4756]: I0318 14:56:07.850505 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564090-697tj"] Mar 18 14:56:09 crc kubenswrapper[4756]: I0318 14:56:09.326262 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eead16c4-568f-4945-ac43-52a7138e6db8" path="/var/lib/kubelet/pods/eead16c4-568f-4945-ac43-52a7138e6db8/volumes" Mar 18 14:56:14 crc kubenswrapper[4756]: I0318 14:56:14.316076 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:56:14 crc kubenswrapper[4756]: E0318 14:56:14.316934 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:56:29 crc kubenswrapper[4756]: I0318 14:56:29.321364 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:56:29 crc kubenswrapper[4756]: E0318 14:56:29.322057 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" 
podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:56:31 crc kubenswrapper[4756]: I0318 14:56:31.250556 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rs8s7"] Mar 18 14:56:31 crc kubenswrapper[4756]: E0318 14:56:31.251289 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93b0b30-b12f-4891-9245-9a2878902bbd" containerName="oc" Mar 18 14:56:31 crc kubenswrapper[4756]: I0318 14:56:31.251302 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93b0b30-b12f-4891-9245-9a2878902bbd" containerName="oc" Mar 18 14:56:31 crc kubenswrapper[4756]: I0318 14:56:31.251541 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93b0b30-b12f-4891-9245-9a2878902bbd" containerName="oc" Mar 18 14:56:31 crc kubenswrapper[4756]: I0318 14:56:31.252995 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:56:31 crc kubenswrapper[4756]: I0318 14:56:31.264719 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rs8s7"] Mar 18 14:56:31 crc kubenswrapper[4756]: I0318 14:56:31.346519 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwdjt\" (UniqueName: \"kubernetes.io/projected/abab2751-a427-4e83-a7ef-7393eed94c71-kube-api-access-dwdjt\") pod \"redhat-operators-rs8s7\" (UID: \"abab2751-a427-4e83-a7ef-7393eed94c71\") " pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:56:31 crc kubenswrapper[4756]: I0318 14:56:31.346568 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abab2751-a427-4e83-a7ef-7393eed94c71-utilities\") pod \"redhat-operators-rs8s7\" (UID: \"abab2751-a427-4e83-a7ef-7393eed94c71\") " pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:56:31 crc kubenswrapper[4756]: 
I0318 14:56:31.346744 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abab2751-a427-4e83-a7ef-7393eed94c71-catalog-content\") pod \"redhat-operators-rs8s7\" (UID: \"abab2751-a427-4e83-a7ef-7393eed94c71\") " pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:56:31 crc kubenswrapper[4756]: I0318 14:56:31.449316 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abab2751-a427-4e83-a7ef-7393eed94c71-utilities\") pod \"redhat-operators-rs8s7\" (UID: \"abab2751-a427-4e83-a7ef-7393eed94c71\") " pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:56:31 crc kubenswrapper[4756]: I0318 14:56:31.449517 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abab2751-a427-4e83-a7ef-7393eed94c71-catalog-content\") pod \"redhat-operators-rs8s7\" (UID: \"abab2751-a427-4e83-a7ef-7393eed94c71\") " pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:56:31 crc kubenswrapper[4756]: I0318 14:56:31.449543 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwdjt\" (UniqueName: \"kubernetes.io/projected/abab2751-a427-4e83-a7ef-7393eed94c71-kube-api-access-dwdjt\") pod \"redhat-operators-rs8s7\" (UID: \"abab2751-a427-4e83-a7ef-7393eed94c71\") " pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:56:31 crc kubenswrapper[4756]: I0318 14:56:31.450347 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abab2751-a427-4e83-a7ef-7393eed94c71-utilities\") pod \"redhat-operators-rs8s7\" (UID: \"abab2751-a427-4e83-a7ef-7393eed94c71\") " pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:56:31 crc kubenswrapper[4756]: I0318 14:56:31.450557 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abab2751-a427-4e83-a7ef-7393eed94c71-catalog-content\") pod \"redhat-operators-rs8s7\" (UID: \"abab2751-a427-4e83-a7ef-7393eed94c71\") " pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:56:31 crc kubenswrapper[4756]: I0318 14:56:31.472790 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwdjt\" (UniqueName: \"kubernetes.io/projected/abab2751-a427-4e83-a7ef-7393eed94c71-kube-api-access-dwdjt\") pod \"redhat-operators-rs8s7\" (UID: \"abab2751-a427-4e83-a7ef-7393eed94c71\") " pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:56:31 crc kubenswrapper[4756]: I0318 14:56:31.617195 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:56:32 crc kubenswrapper[4756]: I0318 14:56:32.437140 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rs8s7"] Mar 18 14:56:32 crc kubenswrapper[4756]: I0318 14:56:32.963482 4756 generic.go:334] "Generic (PLEG): container finished" podID="abab2751-a427-4e83-a7ef-7393eed94c71" containerID="19ad716af839482e84ea38fc7d2df2aa09ca3d17323b40a6aabcaa425f3c5a0f" exitCode=0 Mar 18 14:56:32 crc kubenswrapper[4756]: I0318 14:56:32.963653 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs8s7" event={"ID":"abab2751-a427-4e83-a7ef-7393eed94c71","Type":"ContainerDied","Data":"19ad716af839482e84ea38fc7d2df2aa09ca3d17323b40a6aabcaa425f3c5a0f"} Mar 18 14:56:32 crc kubenswrapper[4756]: I0318 14:56:32.964245 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs8s7" event={"ID":"abab2751-a427-4e83-a7ef-7393eed94c71","Type":"ContainerStarted","Data":"c76ff0404e4e0356eb534558ca77093e1c0b62de543345f0cc6a6e8120a49b41"} Mar 18 14:56:34 crc 
kubenswrapper[4756]: I0318 14:56:34.552155 4756 scope.go:117] "RemoveContainer" containerID="9c7bc4985276144d5c63fcfe46a497b63a10463e6ebc79f6cd96f794b1f7d192" Mar 18 14:56:35 crc kubenswrapper[4756]: I0318 14:56:35.995985 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs8s7" event={"ID":"abab2751-a427-4e83-a7ef-7393eed94c71","Type":"ContainerStarted","Data":"faa9f41709e23c7a597da8f5e9747ad0367651efd12b7ea89c0614567c751b4c"} Mar 18 14:56:41 crc kubenswrapper[4756]: I0318 14:56:41.315762 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:56:41 crc kubenswrapper[4756]: E0318 14:56:41.316604 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:56:42 crc kubenswrapper[4756]: I0318 14:56:42.057123 4756 generic.go:334] "Generic (PLEG): container finished" podID="abab2751-a427-4e83-a7ef-7393eed94c71" containerID="faa9f41709e23c7a597da8f5e9747ad0367651efd12b7ea89c0614567c751b4c" exitCode=0 Mar 18 14:56:42 crc kubenswrapper[4756]: I0318 14:56:42.057183 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs8s7" event={"ID":"abab2751-a427-4e83-a7ef-7393eed94c71","Type":"ContainerDied","Data":"faa9f41709e23c7a597da8f5e9747ad0367651efd12b7ea89c0614567c751b4c"} Mar 18 14:56:44 crc kubenswrapper[4756]: I0318 14:56:44.084454 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs8s7" 
event={"ID":"abab2751-a427-4e83-a7ef-7393eed94c71","Type":"ContainerStarted","Data":"4da3470224078e85fc26dc0700f936f6d0b1cf6df78d569e43ddd15c296579da"} Mar 18 14:56:44 crc kubenswrapper[4756]: I0318 14:56:44.107473 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rs8s7" podStartSLOduration=2.882377276 podStartE2EDuration="13.107455616s" podCreationTimestamp="2026-03-18 14:56:31 +0000 UTC" firstStartedPulling="2026-03-18 14:56:32.965528076 +0000 UTC m=+3394.279946051" lastFinishedPulling="2026-03-18 14:56:43.190606416 +0000 UTC m=+3404.505024391" observedRunningTime="2026-03-18 14:56:44.10169518 +0000 UTC m=+3405.416113165" watchObservedRunningTime="2026-03-18 14:56:44.107455616 +0000 UTC m=+3405.421873591" Mar 18 14:56:44 crc kubenswrapper[4756]: I0318 14:56:44.452396 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wjdgp"] Mar 18 14:56:44 crc kubenswrapper[4756]: I0318 14:56:44.454622 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:56:44 crc kubenswrapper[4756]: I0318 14:56:44.463017 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wjdgp"] Mar 18 14:56:44 crc kubenswrapper[4756]: I0318 14:56:44.536552 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10846cc4-9288-4f06-a0d7-2cbf1108171d-utilities\") pod \"community-operators-wjdgp\" (UID: \"10846cc4-9288-4f06-a0d7-2cbf1108171d\") " pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:56:44 crc kubenswrapper[4756]: I0318 14:56:44.536627 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgplx\" (UniqueName: \"kubernetes.io/projected/10846cc4-9288-4f06-a0d7-2cbf1108171d-kube-api-access-hgplx\") pod \"community-operators-wjdgp\" (UID: \"10846cc4-9288-4f06-a0d7-2cbf1108171d\") " pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:56:44 crc kubenswrapper[4756]: I0318 14:56:44.536695 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10846cc4-9288-4f06-a0d7-2cbf1108171d-catalog-content\") pod \"community-operators-wjdgp\" (UID: \"10846cc4-9288-4f06-a0d7-2cbf1108171d\") " pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:56:44 crc kubenswrapper[4756]: I0318 14:56:44.638327 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10846cc4-9288-4f06-a0d7-2cbf1108171d-catalog-content\") pod \"community-operators-wjdgp\" (UID: \"10846cc4-9288-4f06-a0d7-2cbf1108171d\") " pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:56:44 crc kubenswrapper[4756]: I0318 14:56:44.638487 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10846cc4-9288-4f06-a0d7-2cbf1108171d-utilities\") pod \"community-operators-wjdgp\" (UID: \"10846cc4-9288-4f06-a0d7-2cbf1108171d\") " pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:56:44 crc kubenswrapper[4756]: I0318 14:56:44.638530 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgplx\" (UniqueName: \"kubernetes.io/projected/10846cc4-9288-4f06-a0d7-2cbf1108171d-kube-api-access-hgplx\") pod \"community-operators-wjdgp\" (UID: \"10846cc4-9288-4f06-a0d7-2cbf1108171d\") " pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:56:44 crc kubenswrapper[4756]: I0318 14:56:44.638882 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10846cc4-9288-4f06-a0d7-2cbf1108171d-catalog-content\") pod \"community-operators-wjdgp\" (UID: \"10846cc4-9288-4f06-a0d7-2cbf1108171d\") " pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:56:44 crc kubenswrapper[4756]: I0318 14:56:44.639000 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10846cc4-9288-4f06-a0d7-2cbf1108171d-utilities\") pod \"community-operators-wjdgp\" (UID: \"10846cc4-9288-4f06-a0d7-2cbf1108171d\") " pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:56:44 crc kubenswrapper[4756]: I0318 14:56:44.659489 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgplx\" (UniqueName: \"kubernetes.io/projected/10846cc4-9288-4f06-a0d7-2cbf1108171d-kube-api-access-hgplx\") pod \"community-operators-wjdgp\" (UID: \"10846cc4-9288-4f06-a0d7-2cbf1108171d\") " pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:56:44 crc kubenswrapper[4756]: I0318 14:56:44.777101 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:56:45 crc kubenswrapper[4756]: I0318 14:56:45.657331 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wjdgp"] Mar 18 14:56:46 crc kubenswrapper[4756]: I0318 14:56:46.138739 4756 generic.go:334] "Generic (PLEG): container finished" podID="10846cc4-9288-4f06-a0d7-2cbf1108171d" containerID="b58d76af1e4a8b3b3a281e5a2923ce6af1cafe24d8c8a678125d9aa37b0fe35a" exitCode=0 Mar 18 14:56:46 crc kubenswrapper[4756]: I0318 14:56:46.138817 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjdgp" event={"ID":"10846cc4-9288-4f06-a0d7-2cbf1108171d","Type":"ContainerDied","Data":"b58d76af1e4a8b3b3a281e5a2923ce6af1cafe24d8c8a678125d9aa37b0fe35a"} Mar 18 14:56:46 crc kubenswrapper[4756]: I0318 14:56:46.139097 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjdgp" event={"ID":"10846cc4-9288-4f06-a0d7-2cbf1108171d","Type":"ContainerStarted","Data":"e6221008d60e70b5b7a485eb1a739bf345fc5d8e883fb6fbae22e32c257dd5f6"} Mar 18 14:56:49 crc kubenswrapper[4756]: I0318 14:56:49.172390 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjdgp" event={"ID":"10846cc4-9288-4f06-a0d7-2cbf1108171d","Type":"ContainerStarted","Data":"27b69a979794adbe19d5ce8a4c388923baf62dce181589542e58cb196db4ca20"} Mar 18 14:56:50 crc kubenswrapper[4756]: I0318 14:56:50.183434 4756 generic.go:334] "Generic (PLEG): container finished" podID="10846cc4-9288-4f06-a0d7-2cbf1108171d" containerID="27b69a979794adbe19d5ce8a4c388923baf62dce181589542e58cb196db4ca20" exitCode=0 Mar 18 14:56:50 crc kubenswrapper[4756]: I0318 14:56:50.183624 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjdgp" 
event={"ID":"10846cc4-9288-4f06-a0d7-2cbf1108171d","Type":"ContainerDied","Data":"27b69a979794adbe19d5ce8a4c388923baf62dce181589542e58cb196db4ca20"} Mar 18 14:56:51 crc kubenswrapper[4756]: I0318 14:56:51.619100 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:56:51 crc kubenswrapper[4756]: I0318 14:56:51.619842 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:56:52 crc kubenswrapper[4756]: I0318 14:56:52.668470 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rs8s7" podUID="abab2751-a427-4e83-a7ef-7393eed94c71" containerName="registry-server" probeResult="failure" output=< Mar 18 14:56:52 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Mar 18 14:56:52 crc kubenswrapper[4756]: > Mar 18 14:56:53 crc kubenswrapper[4756]: I0318 14:56:53.222453 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjdgp" event={"ID":"10846cc4-9288-4f06-a0d7-2cbf1108171d","Type":"ContainerStarted","Data":"5ddcf97201de03a711b12601671e3cacd6df0652c1a3cb760d1fd8f9b24a99fd"} Mar 18 14:56:53 crc kubenswrapper[4756]: I0318 14:56:53.262517 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wjdgp" podStartSLOduration=3.106934761 podStartE2EDuration="9.262500401s" podCreationTimestamp="2026-03-18 14:56:44 +0000 UTC" firstStartedPulling="2026-03-18 14:56:46.140794017 +0000 UTC m=+3407.455211992" lastFinishedPulling="2026-03-18 14:56:52.296359657 +0000 UTC m=+3413.610777632" observedRunningTime="2026-03-18 14:56:53.256472197 +0000 UTC m=+3414.570890172" watchObservedRunningTime="2026-03-18 14:56:53.262500401 +0000 UTC m=+3414.576918376" Mar 18 14:56:54 crc kubenswrapper[4756]: I0318 14:56:54.778049 4756 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:56:54 crc kubenswrapper[4756]: I0318 14:56:54.778106 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:56:54 crc kubenswrapper[4756]: I0318 14:56:54.826608 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:56:56 crc kubenswrapper[4756]: I0318 14:56:56.315071 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:56:56 crc kubenswrapper[4756]: E0318 14:56:56.315356 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:57:02 crc kubenswrapper[4756]: I0318 14:57:02.298348 4756 generic.go:334] "Generic (PLEG): container finished" podID="2539c4f7-fc16-442b-8f7b-54e4fa71470c" containerID="b307208590d4faab1576bde2d6ec771d0e43e3062bb8529045c6c10ac0782c02" exitCode=0 Mar 18 14:57:02 crc kubenswrapper[4756]: I0318 14:57:02.298423 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" event={"ID":"2539c4f7-fc16-442b-8f7b-54e4fa71470c","Type":"ContainerDied","Data":"b307208590d4faab1576bde2d6ec771d0e43e3062bb8529045c6c10ac0782c02"} Mar 18 14:57:02 crc kubenswrapper[4756]: I0318 14:57:02.670770 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rs8s7" podUID="abab2751-a427-4e83-a7ef-7393eed94c71" containerName="registry-server" probeResult="failure" output=< 
Mar 18 14:57:02 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Mar 18 14:57:02 crc kubenswrapper[4756]: > Mar 18 14:57:03 crc kubenswrapper[4756]: I0318 14:57:03.424107 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" Mar 18 14:57:03 crc kubenswrapper[4756]: I0318 14:57:03.470201 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m8n6h/crc-debug-s5c5w"] Mar 18 14:57:03 crc kubenswrapper[4756]: I0318 14:57:03.478636 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m8n6h/crc-debug-s5c5w"] Mar 18 14:57:03 crc kubenswrapper[4756]: I0318 14:57:03.549644 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2539c4f7-fc16-442b-8f7b-54e4fa71470c-host\") pod \"2539c4f7-fc16-442b-8f7b-54e4fa71470c\" (UID: \"2539c4f7-fc16-442b-8f7b-54e4fa71470c\") " Mar 18 14:57:03 crc kubenswrapper[4756]: I0318 14:57:03.549784 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2539c4f7-fc16-442b-8f7b-54e4fa71470c-host" (OuterVolumeSpecName: "host") pod "2539c4f7-fc16-442b-8f7b-54e4fa71470c" (UID: "2539c4f7-fc16-442b-8f7b-54e4fa71470c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:57:03 crc kubenswrapper[4756]: I0318 14:57:03.549991 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5v7m\" (UniqueName: \"kubernetes.io/projected/2539c4f7-fc16-442b-8f7b-54e4fa71470c-kube-api-access-j5v7m\") pod \"2539c4f7-fc16-442b-8f7b-54e4fa71470c\" (UID: \"2539c4f7-fc16-442b-8f7b-54e4fa71470c\") " Mar 18 14:57:03 crc kubenswrapper[4756]: I0318 14:57:03.550426 4756 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2539c4f7-fc16-442b-8f7b-54e4fa71470c-host\") on node \"crc\" DevicePath \"\"" Mar 18 14:57:03 crc kubenswrapper[4756]: I0318 14:57:03.557269 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2539c4f7-fc16-442b-8f7b-54e4fa71470c-kube-api-access-j5v7m" (OuterVolumeSpecName: "kube-api-access-j5v7m") pod "2539c4f7-fc16-442b-8f7b-54e4fa71470c" (UID: "2539c4f7-fc16-442b-8f7b-54e4fa71470c"). InnerVolumeSpecName "kube-api-access-j5v7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:57:03 crc kubenswrapper[4756]: I0318 14:57:03.652750 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5v7m\" (UniqueName: \"kubernetes.io/projected/2539c4f7-fc16-442b-8f7b-54e4fa71470c-kube-api-access-j5v7m\") on node \"crc\" DevicePath \"\"" Mar 18 14:57:04 crc kubenswrapper[4756]: I0318 14:57:04.317317 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3272a2907fa690454e32917fd3a4a99d4f6c20c194291f2e30995e3bf2e11011" Mar 18 14:57:04 crc kubenswrapper[4756]: I0318 14:57:04.317359 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m8n6h/crc-debug-s5c5w" Mar 18 14:57:04 crc kubenswrapper[4756]: I0318 14:57:04.835894 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:57:04 crc kubenswrapper[4756]: I0318 14:57:04.867533 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m8n6h/crc-debug-5fq88"] Mar 18 14:57:04 crc kubenswrapper[4756]: E0318 14:57:04.867992 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2539c4f7-fc16-442b-8f7b-54e4fa71470c" containerName="container-00" Mar 18 14:57:04 crc kubenswrapper[4756]: I0318 14:57:04.868021 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2539c4f7-fc16-442b-8f7b-54e4fa71470c" containerName="container-00" Mar 18 14:57:04 crc kubenswrapper[4756]: I0318 14:57:04.868245 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2539c4f7-fc16-442b-8f7b-54e4fa71470c" containerName="container-00" Mar 18 14:57:04 crc kubenswrapper[4756]: I0318 14:57:04.868910 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m8n6h/crc-debug-5fq88" Mar 18 14:57:04 crc kubenswrapper[4756]: I0318 14:57:04.876713 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-m8n6h"/"default-dockercfg-l8xhl" Mar 18 14:57:04 crc kubenswrapper[4756]: I0318 14:57:04.896683 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wjdgp"] Mar 18 14:57:04 crc kubenswrapper[4756]: I0318 14:57:04.977490 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f232e407-2c28-42d7-a681-9ec185068eaa-host\") pod \"crc-debug-5fq88\" (UID: \"f232e407-2c28-42d7-a681-9ec185068eaa\") " pod="openshift-must-gather-m8n6h/crc-debug-5fq88" Mar 18 14:57:04 crc kubenswrapper[4756]: I0318 14:57:04.977656 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk6xc\" (UniqueName: \"kubernetes.io/projected/f232e407-2c28-42d7-a681-9ec185068eaa-kube-api-access-fk6xc\") pod \"crc-debug-5fq88\" (UID: \"f232e407-2c28-42d7-a681-9ec185068eaa\") " pod="openshift-must-gather-m8n6h/crc-debug-5fq88" Mar 18 14:57:05 crc kubenswrapper[4756]: I0318 14:57:05.079608 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk6xc\" (UniqueName: \"kubernetes.io/projected/f232e407-2c28-42d7-a681-9ec185068eaa-kube-api-access-fk6xc\") pod \"crc-debug-5fq88\" (UID: \"f232e407-2c28-42d7-a681-9ec185068eaa\") " pod="openshift-must-gather-m8n6h/crc-debug-5fq88" Mar 18 14:57:05 crc kubenswrapper[4756]: I0318 14:57:05.079743 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f232e407-2c28-42d7-a681-9ec185068eaa-host\") pod \"crc-debug-5fq88\" (UID: \"f232e407-2c28-42d7-a681-9ec185068eaa\") " pod="openshift-must-gather-m8n6h/crc-debug-5fq88" Mar 18 14:57:05 
crc kubenswrapper[4756]: I0318 14:57:05.079904 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f232e407-2c28-42d7-a681-9ec185068eaa-host\") pod \"crc-debug-5fq88\" (UID: \"f232e407-2c28-42d7-a681-9ec185068eaa\") " pod="openshift-must-gather-m8n6h/crc-debug-5fq88" Mar 18 14:57:05 crc kubenswrapper[4756]: I0318 14:57:05.096801 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk6xc\" (UniqueName: \"kubernetes.io/projected/f232e407-2c28-42d7-a681-9ec185068eaa-kube-api-access-fk6xc\") pod \"crc-debug-5fq88\" (UID: \"f232e407-2c28-42d7-a681-9ec185068eaa\") " pod="openshift-must-gather-m8n6h/crc-debug-5fq88" Mar 18 14:57:05 crc kubenswrapper[4756]: I0318 14:57:05.187255 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m8n6h/crc-debug-5fq88" Mar 18 14:57:05 crc kubenswrapper[4756]: W0318 14:57:05.217987 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf232e407_2c28_42d7_a681_9ec185068eaa.slice/crio-7aaa05ff077c412b96a47f29605b63e09142a0d6286d757765bc56088c71e97f WatchSource:0}: Error finding container 7aaa05ff077c412b96a47f29605b63e09142a0d6286d757765bc56088c71e97f: Status 404 returned error can't find the container with id 7aaa05ff077c412b96a47f29605b63e09142a0d6286d757765bc56088c71e97f Mar 18 14:57:05 crc kubenswrapper[4756]: I0318 14:57:05.359195 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2539c4f7-fc16-442b-8f7b-54e4fa71470c" path="/var/lib/kubelet/pods/2539c4f7-fc16-442b-8f7b-54e4fa71470c/volumes" Mar 18 14:57:05 crc kubenswrapper[4756]: I0318 14:57:05.380803 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wjdgp" podUID="10846cc4-9288-4f06-a0d7-2cbf1108171d" containerName="registry-server" 
containerID="cri-o://5ddcf97201de03a711b12601671e3cacd6df0652c1a3cb760d1fd8f9b24a99fd" gracePeriod=2 Mar 18 14:57:05 crc kubenswrapper[4756]: I0318 14:57:05.381199 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m8n6h/crc-debug-5fq88" event={"ID":"f232e407-2c28-42d7-a681-9ec185068eaa","Type":"ContainerStarted","Data":"7aaa05ff077c412b96a47f29605b63e09142a0d6286d757765bc56088c71e97f"} Mar 18 14:57:06 crc kubenswrapper[4756]: I0318 14:57:06.393155 4756 generic.go:334] "Generic (PLEG): container finished" podID="f232e407-2c28-42d7-a681-9ec185068eaa" containerID="3636ed4ef77bf1ae76d810ed4f925bb6b7d09b0c9da0b5263437f979b73af531" exitCode=0 Mar 18 14:57:06 crc kubenswrapper[4756]: I0318 14:57:06.393255 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m8n6h/crc-debug-5fq88" event={"ID":"f232e407-2c28-42d7-a681-9ec185068eaa","Type":"ContainerDied","Data":"3636ed4ef77bf1ae76d810ed4f925bb6b7d09b0c9da0b5263437f979b73af531"} Mar 18 14:57:06 crc kubenswrapper[4756]: I0318 14:57:06.395373 4756 generic.go:334] "Generic (PLEG): container finished" podID="10846cc4-9288-4f06-a0d7-2cbf1108171d" containerID="5ddcf97201de03a711b12601671e3cacd6df0652c1a3cb760d1fd8f9b24a99fd" exitCode=0 Mar 18 14:57:06 crc kubenswrapper[4756]: I0318 14:57:06.395416 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjdgp" event={"ID":"10846cc4-9288-4f06-a0d7-2cbf1108171d","Type":"ContainerDied","Data":"5ddcf97201de03a711b12601671e3cacd6df0652c1a3cb760d1fd8f9b24a99fd"} Mar 18 14:57:06 crc kubenswrapper[4756]: I0318 14:57:06.594392 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:57:06 crc kubenswrapper[4756]: I0318 14:57:06.725816 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10846cc4-9288-4f06-a0d7-2cbf1108171d-catalog-content\") pod \"10846cc4-9288-4f06-a0d7-2cbf1108171d\" (UID: \"10846cc4-9288-4f06-a0d7-2cbf1108171d\") " Mar 18 14:57:06 crc kubenswrapper[4756]: I0318 14:57:06.725986 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10846cc4-9288-4f06-a0d7-2cbf1108171d-utilities\") pod \"10846cc4-9288-4f06-a0d7-2cbf1108171d\" (UID: \"10846cc4-9288-4f06-a0d7-2cbf1108171d\") " Mar 18 14:57:06 crc kubenswrapper[4756]: I0318 14:57:06.726009 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgplx\" (UniqueName: \"kubernetes.io/projected/10846cc4-9288-4f06-a0d7-2cbf1108171d-kube-api-access-hgplx\") pod \"10846cc4-9288-4f06-a0d7-2cbf1108171d\" (UID: \"10846cc4-9288-4f06-a0d7-2cbf1108171d\") " Mar 18 14:57:06 crc kubenswrapper[4756]: I0318 14:57:06.728100 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10846cc4-9288-4f06-a0d7-2cbf1108171d-utilities" (OuterVolumeSpecName: "utilities") pod "10846cc4-9288-4f06-a0d7-2cbf1108171d" (UID: "10846cc4-9288-4f06-a0d7-2cbf1108171d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:57:06 crc kubenswrapper[4756]: I0318 14:57:06.744389 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10846cc4-9288-4f06-a0d7-2cbf1108171d-kube-api-access-hgplx" (OuterVolumeSpecName: "kube-api-access-hgplx") pod "10846cc4-9288-4f06-a0d7-2cbf1108171d" (UID: "10846cc4-9288-4f06-a0d7-2cbf1108171d"). InnerVolumeSpecName "kube-api-access-hgplx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:57:06 crc kubenswrapper[4756]: I0318 14:57:06.811669 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10846cc4-9288-4f06-a0d7-2cbf1108171d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10846cc4-9288-4f06-a0d7-2cbf1108171d" (UID: "10846cc4-9288-4f06-a0d7-2cbf1108171d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:57:06 crc kubenswrapper[4756]: I0318 14:57:06.830619 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10846cc4-9288-4f06-a0d7-2cbf1108171d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:57:06 crc kubenswrapper[4756]: I0318 14:57:06.830657 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10846cc4-9288-4f06-a0d7-2cbf1108171d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:57:06 crc kubenswrapper[4756]: I0318 14:57:06.830669 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgplx\" (UniqueName: \"kubernetes.io/projected/10846cc4-9288-4f06-a0d7-2cbf1108171d-kube-api-access-hgplx\") on node \"crc\" DevicePath \"\"" Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.415781 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wjdgp" Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.416132 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjdgp" event={"ID":"10846cc4-9288-4f06-a0d7-2cbf1108171d","Type":"ContainerDied","Data":"e6221008d60e70b5b7a485eb1a739bf345fc5d8e883fb6fbae22e32c257dd5f6"} Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.416185 4756 scope.go:117] "RemoveContainer" containerID="5ddcf97201de03a711b12601671e3cacd6df0652c1a3cb760d1fd8f9b24a99fd" Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.451917 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wjdgp"] Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.469091 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wjdgp"] Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.522794 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m8n6h/crc-debug-5fq88" Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.540865 4756 scope.go:117] "RemoveContainer" containerID="27b69a979794adbe19d5ce8a4c388923baf62dce181589542e58cb196db4ca20" Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.540979 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m8n6h/crc-debug-5fq88"] Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.557694 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m8n6h/crc-debug-5fq88"] Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.571615 4756 scope.go:117] "RemoveContainer" containerID="b58d76af1e4a8b3b3a281e5a2923ce6af1cafe24d8c8a678125d9aa37b0fe35a" Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.646651 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f232e407-2c28-42d7-a681-9ec185068eaa-host\") pod \"f232e407-2c28-42d7-a681-9ec185068eaa\" (UID: \"f232e407-2c28-42d7-a681-9ec185068eaa\") " Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.646708 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk6xc\" (UniqueName: \"kubernetes.io/projected/f232e407-2c28-42d7-a681-9ec185068eaa-kube-api-access-fk6xc\") pod \"f232e407-2c28-42d7-a681-9ec185068eaa\" (UID: \"f232e407-2c28-42d7-a681-9ec185068eaa\") " Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.647007 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f232e407-2c28-42d7-a681-9ec185068eaa-host" (OuterVolumeSpecName: "host") pod "f232e407-2c28-42d7-a681-9ec185068eaa" (UID: "f232e407-2c28-42d7-a681-9ec185068eaa"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.647520 4756 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f232e407-2c28-42d7-a681-9ec185068eaa-host\") on node \"crc\" DevicePath \"\"" Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.671086 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f232e407-2c28-42d7-a681-9ec185068eaa-kube-api-access-fk6xc" (OuterVolumeSpecName: "kube-api-access-fk6xc") pod "f232e407-2c28-42d7-a681-9ec185068eaa" (UID: "f232e407-2c28-42d7-a681-9ec185068eaa"). InnerVolumeSpecName "kube-api-access-fk6xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:57:07 crc kubenswrapper[4756]: I0318 14:57:07.749809 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk6xc\" (UniqueName: \"kubernetes.io/projected/f232e407-2c28-42d7-a681-9ec185068eaa-kube-api-access-fk6xc\") on node \"crc\" DevicePath \"\"" Mar 18 14:57:08 crc kubenswrapper[4756]: I0318 14:57:08.427456 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aaa05ff077c412b96a47f29605b63e09142a0d6286d757765bc56088c71e97f" Mar 18 14:57:08 crc kubenswrapper[4756]: I0318 14:57:08.427478 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m8n6h/crc-debug-5fq88" Mar 18 14:57:08 crc kubenswrapper[4756]: I0318 14:57:08.978115 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m8n6h/crc-debug-zj4wb"] Mar 18 14:57:08 crc kubenswrapper[4756]: E0318 14:57:08.978969 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10846cc4-9288-4f06-a0d7-2cbf1108171d" containerName="registry-server" Mar 18 14:57:08 crc kubenswrapper[4756]: I0318 14:57:08.979052 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="10846cc4-9288-4f06-a0d7-2cbf1108171d" containerName="registry-server" Mar 18 14:57:08 crc kubenswrapper[4756]: E0318 14:57:08.979147 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f232e407-2c28-42d7-a681-9ec185068eaa" containerName="container-00" Mar 18 14:57:08 crc kubenswrapper[4756]: I0318 14:57:08.979201 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="f232e407-2c28-42d7-a681-9ec185068eaa" containerName="container-00" Mar 18 14:57:08 crc kubenswrapper[4756]: E0318 14:57:08.979286 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10846cc4-9288-4f06-a0d7-2cbf1108171d" containerName="extract-utilities" Mar 18 14:57:08 crc kubenswrapper[4756]: I0318 14:57:08.979343 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="10846cc4-9288-4f06-a0d7-2cbf1108171d" containerName="extract-utilities" Mar 18 14:57:08 crc kubenswrapper[4756]: E0318 14:57:08.979398 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10846cc4-9288-4f06-a0d7-2cbf1108171d" containerName="extract-content" Mar 18 14:57:08 crc kubenswrapper[4756]: I0318 14:57:08.979445 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="10846cc4-9288-4f06-a0d7-2cbf1108171d" containerName="extract-content" Mar 18 14:57:08 crc kubenswrapper[4756]: I0318 14:57:08.979694 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="10846cc4-9288-4f06-a0d7-2cbf1108171d" 
containerName="registry-server" Mar 18 14:57:08 crc kubenswrapper[4756]: I0318 14:57:08.979767 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f232e407-2c28-42d7-a681-9ec185068eaa" containerName="container-00" Mar 18 14:57:08 crc kubenswrapper[4756]: I0318 14:57:08.980726 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m8n6h/crc-debug-zj4wb" Mar 18 14:57:08 crc kubenswrapper[4756]: I0318 14:57:08.982324 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-m8n6h"/"default-dockercfg-l8xhl" Mar 18 14:57:09 crc kubenswrapper[4756]: I0318 14:57:09.079718 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c46695c-ccbe-405c-9cae-c64d49f4d557-host\") pod \"crc-debug-zj4wb\" (UID: \"4c46695c-ccbe-405c-9cae-c64d49f4d557\") " pod="openshift-must-gather-m8n6h/crc-debug-zj4wb" Mar 18 14:57:09 crc kubenswrapper[4756]: I0318 14:57:09.079823 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t97zz\" (UniqueName: \"kubernetes.io/projected/4c46695c-ccbe-405c-9cae-c64d49f4d557-kube-api-access-t97zz\") pod \"crc-debug-zj4wb\" (UID: \"4c46695c-ccbe-405c-9cae-c64d49f4d557\") " pod="openshift-must-gather-m8n6h/crc-debug-zj4wb" Mar 18 14:57:09 crc kubenswrapper[4756]: I0318 14:57:09.186226 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t97zz\" (UniqueName: \"kubernetes.io/projected/4c46695c-ccbe-405c-9cae-c64d49f4d557-kube-api-access-t97zz\") pod \"crc-debug-zj4wb\" (UID: \"4c46695c-ccbe-405c-9cae-c64d49f4d557\") " pod="openshift-must-gather-m8n6h/crc-debug-zj4wb" Mar 18 14:57:09 crc kubenswrapper[4756]: I0318 14:57:09.186610 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/4c46695c-ccbe-405c-9cae-c64d49f4d557-host\") pod \"crc-debug-zj4wb\" (UID: \"4c46695c-ccbe-405c-9cae-c64d49f4d557\") " pod="openshift-must-gather-m8n6h/crc-debug-zj4wb" Mar 18 14:57:09 crc kubenswrapper[4756]: I0318 14:57:09.186801 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c46695c-ccbe-405c-9cae-c64d49f4d557-host\") pod \"crc-debug-zj4wb\" (UID: \"4c46695c-ccbe-405c-9cae-c64d49f4d557\") " pod="openshift-must-gather-m8n6h/crc-debug-zj4wb" Mar 18 14:57:09 crc kubenswrapper[4756]: I0318 14:57:09.211850 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t97zz\" (UniqueName: \"kubernetes.io/projected/4c46695c-ccbe-405c-9cae-c64d49f4d557-kube-api-access-t97zz\") pod \"crc-debug-zj4wb\" (UID: \"4c46695c-ccbe-405c-9cae-c64d49f4d557\") " pod="openshift-must-gather-m8n6h/crc-debug-zj4wb" Mar 18 14:57:09 crc kubenswrapper[4756]: I0318 14:57:09.297378 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m8n6h/crc-debug-zj4wb" Mar 18 14:57:09 crc kubenswrapper[4756]: I0318 14:57:09.336553 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10846cc4-9288-4f06-a0d7-2cbf1108171d" path="/var/lib/kubelet/pods/10846cc4-9288-4f06-a0d7-2cbf1108171d/volumes" Mar 18 14:57:09 crc kubenswrapper[4756]: I0318 14:57:09.337290 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f232e407-2c28-42d7-a681-9ec185068eaa" path="/var/lib/kubelet/pods/f232e407-2c28-42d7-a681-9ec185068eaa/volumes" Mar 18 14:57:09 crc kubenswrapper[4756]: I0318 14:57:09.454888 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m8n6h/crc-debug-zj4wb" event={"ID":"4c46695c-ccbe-405c-9cae-c64d49f4d557","Type":"ContainerStarted","Data":"45fc17830b577583299df3db83425955735066ef1813f0700654828d6b9c4532"} Mar 18 14:57:10 crc kubenswrapper[4756]: I0318 14:57:10.464897 4756 generic.go:334] "Generic (PLEG): container finished" podID="4c46695c-ccbe-405c-9cae-c64d49f4d557" containerID="329bd9e45673e138f7129f3b8d3a2f045461b4d959e21c3a93ce6c2e20c35abc" exitCode=0 Mar 18 14:57:10 crc kubenswrapper[4756]: I0318 14:57:10.465098 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m8n6h/crc-debug-zj4wb" event={"ID":"4c46695c-ccbe-405c-9cae-c64d49f4d557","Type":"ContainerDied","Data":"329bd9e45673e138f7129f3b8d3a2f045461b4d959e21c3a93ce6c2e20c35abc"} Mar 18 14:57:10 crc kubenswrapper[4756]: I0318 14:57:10.501689 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m8n6h/crc-debug-zj4wb"] Mar 18 14:57:10 crc kubenswrapper[4756]: I0318 14:57:10.509692 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m8n6h/crc-debug-zj4wb"] Mar 18 14:57:11 crc kubenswrapper[4756]: I0318 14:57:11.315864 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 
14:57:11 crc kubenswrapper[4756]: E0318 14:57:11.316321 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:57:11 crc kubenswrapper[4756]: I0318 14:57:11.606223 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m8n6h/crc-debug-zj4wb" Mar 18 14:57:11 crc kubenswrapper[4756]: I0318 14:57:11.736431 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c46695c-ccbe-405c-9cae-c64d49f4d557-host\") pod \"4c46695c-ccbe-405c-9cae-c64d49f4d557\" (UID: \"4c46695c-ccbe-405c-9cae-c64d49f4d557\") " Mar 18 14:57:11 crc kubenswrapper[4756]: I0318 14:57:11.736509 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c46695c-ccbe-405c-9cae-c64d49f4d557-host" (OuterVolumeSpecName: "host") pod "4c46695c-ccbe-405c-9cae-c64d49f4d557" (UID: "4c46695c-ccbe-405c-9cae-c64d49f4d557"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:57:11 crc kubenswrapper[4756]: I0318 14:57:11.736638 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t97zz\" (UniqueName: \"kubernetes.io/projected/4c46695c-ccbe-405c-9cae-c64d49f4d557-kube-api-access-t97zz\") pod \"4c46695c-ccbe-405c-9cae-c64d49f4d557\" (UID: \"4c46695c-ccbe-405c-9cae-c64d49f4d557\") " Mar 18 14:57:11 crc kubenswrapper[4756]: I0318 14:57:11.737215 4756 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c46695c-ccbe-405c-9cae-c64d49f4d557-host\") on node \"crc\" DevicePath \"\"" Mar 18 14:57:11 crc kubenswrapper[4756]: I0318 14:57:11.741576 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c46695c-ccbe-405c-9cae-c64d49f4d557-kube-api-access-t97zz" (OuterVolumeSpecName: "kube-api-access-t97zz") pod "4c46695c-ccbe-405c-9cae-c64d49f4d557" (UID: "4c46695c-ccbe-405c-9cae-c64d49f4d557"). InnerVolumeSpecName "kube-api-access-t97zz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:57:11 crc kubenswrapper[4756]: I0318 14:57:11.838537 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t97zz\" (UniqueName: \"kubernetes.io/projected/4c46695c-ccbe-405c-9cae-c64d49f4d557-kube-api-access-t97zz\") on node \"crc\" DevicePath \"\"" Mar 18 14:57:12 crc kubenswrapper[4756]: I0318 14:57:12.483570 4756 scope.go:117] "RemoveContainer" containerID="329bd9e45673e138f7129f3b8d3a2f045461b4d959e21c3a93ce6c2e20c35abc" Mar 18 14:57:12 crc kubenswrapper[4756]: I0318 14:57:12.483623 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m8n6h/crc-debug-zj4wb" Mar 18 14:57:12 crc kubenswrapper[4756]: I0318 14:57:12.666469 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rs8s7" podUID="abab2751-a427-4e83-a7ef-7393eed94c71" containerName="registry-server" probeResult="failure" output=< Mar 18 14:57:12 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Mar 18 14:57:12 crc kubenswrapper[4756]: > Mar 18 14:57:13 crc kubenswrapper[4756]: I0318 14:57:13.325098 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c46695c-ccbe-405c-9cae-c64d49f4d557" path="/var/lib/kubelet/pods/4c46695c-ccbe-405c-9cae-c64d49f4d557/volumes" Mar 18 14:57:22 crc kubenswrapper[4756]: I0318 14:57:22.693552 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rs8s7" podUID="abab2751-a427-4e83-a7ef-7393eed94c71" containerName="registry-server" probeResult="failure" output=< Mar 18 14:57:22 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Mar 18 14:57:22 crc kubenswrapper[4756]: > Mar 18 14:57:25 crc kubenswrapper[4756]: I0318 14:57:25.317155 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:57:25 crc kubenswrapper[4756]: E0318 14:57:25.317943 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:57:31 crc kubenswrapper[4756]: I0318 14:57:31.676819 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:57:31 crc kubenswrapper[4756]: I0318 14:57:31.754044 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:57:32 crc kubenswrapper[4756]: I0318 14:57:32.490497 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rs8s7"] Mar 18 14:57:33 crc kubenswrapper[4756]: I0318 14:57:33.665500 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rs8s7" podUID="abab2751-a427-4e83-a7ef-7393eed94c71" containerName="registry-server" containerID="cri-o://4da3470224078e85fc26dc0700f936f6d0b1cf6df78d569e43ddd15c296579da" gracePeriod=2 Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.619952 4756 scope.go:117] "RemoveContainer" containerID="643df94366409f508c501b1f3a4af81709c063ced3e1df4e6b1e6a84c6e33dfe" Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.675213 4756 generic.go:334] "Generic (PLEG): container finished" podID="abab2751-a427-4e83-a7ef-7393eed94c71" containerID="4da3470224078e85fc26dc0700f936f6d0b1cf6df78d569e43ddd15c296579da" exitCode=0 Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.675284 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs8s7" event={"ID":"abab2751-a427-4e83-a7ef-7393eed94c71","Type":"ContainerDied","Data":"4da3470224078e85fc26dc0700f936f6d0b1cf6df78d569e43ddd15c296579da"} Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.675335 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs8s7" event={"ID":"abab2751-a427-4e83-a7ef-7393eed94c71","Type":"ContainerDied","Data":"c76ff0404e4e0356eb534558ca77093e1c0b62de543345f0cc6a6e8120a49b41"} Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.675348 4756 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="c76ff0404e4e0356eb534558ca77093e1c0b62de543345f0cc6a6e8120a49b41" Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.716370 4756 scope.go:117] "RemoveContainer" containerID="90d032c58dfe481b577d0cf654f7dd86d1debd3bbdf5056608379ca4d3b9dfb7" Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.746692 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.771897 4756 scope.go:117] "RemoveContainer" containerID="1160dca893e205a7e1ae07baf6179a8d829e8f9981191175936b2b14882f3492" Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.812973 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwdjt\" (UniqueName: \"kubernetes.io/projected/abab2751-a427-4e83-a7ef-7393eed94c71-kube-api-access-dwdjt\") pod \"abab2751-a427-4e83-a7ef-7393eed94c71\" (UID: \"abab2751-a427-4e83-a7ef-7393eed94c71\") " Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.813033 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abab2751-a427-4e83-a7ef-7393eed94c71-utilities\") pod \"abab2751-a427-4e83-a7ef-7393eed94c71\" (UID: \"abab2751-a427-4e83-a7ef-7393eed94c71\") " Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.813074 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abab2751-a427-4e83-a7ef-7393eed94c71-catalog-content\") pod \"abab2751-a427-4e83-a7ef-7393eed94c71\" (UID: \"abab2751-a427-4e83-a7ef-7393eed94c71\") " Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.813876 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abab2751-a427-4e83-a7ef-7393eed94c71-utilities" (OuterVolumeSpecName: "utilities") pod "abab2751-a427-4e83-a7ef-7393eed94c71" (UID: 
"abab2751-a427-4e83-a7ef-7393eed94c71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.821634 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abab2751-a427-4e83-a7ef-7393eed94c71-kube-api-access-dwdjt" (OuterVolumeSpecName: "kube-api-access-dwdjt") pod "abab2751-a427-4e83-a7ef-7393eed94c71" (UID: "abab2751-a427-4e83-a7ef-7393eed94c71"). InnerVolumeSpecName "kube-api-access-dwdjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.915936 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwdjt\" (UniqueName: \"kubernetes.io/projected/abab2751-a427-4e83-a7ef-7393eed94c71-kube-api-access-dwdjt\") on node \"crc\" DevicePath \"\"" Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.915967 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abab2751-a427-4e83-a7ef-7393eed94c71-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:57:34 crc kubenswrapper[4756]: I0318 14:57:34.956819 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abab2751-a427-4e83-a7ef-7393eed94c71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abab2751-a427-4e83-a7ef-7393eed94c71" (UID: "abab2751-a427-4e83-a7ef-7393eed94c71"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:57:35 crc kubenswrapper[4756]: I0318 14:57:35.018035 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abab2751-a427-4e83-a7ef-7393eed94c71-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:57:35 crc kubenswrapper[4756]: I0318 14:57:35.684604 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rs8s7" Mar 18 14:57:35 crc kubenswrapper[4756]: I0318 14:57:35.708527 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rs8s7"] Mar 18 14:57:35 crc kubenswrapper[4756]: I0318 14:57:35.725101 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rs8s7"] Mar 18 14:57:37 crc kubenswrapper[4756]: I0318 14:57:37.327628 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abab2751-a427-4e83-a7ef-7393eed94c71" path="/var/lib/kubelet/pods/abab2751-a427-4e83-a7ef-7393eed94c71/volumes" Mar 18 14:57:39 crc kubenswrapper[4756]: I0318 14:57:39.322485 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:57:39 crc kubenswrapper[4756]: E0318 14:57:39.323460 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:57:49 crc kubenswrapper[4756]: I0318 14:57:49.105508 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_529ae791-8631-4ca5-9e4b-bb857d6264a8/init-config-reloader/0.log" Mar 18 14:57:49 crc kubenswrapper[4756]: I0318 14:57:49.568141 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_529ae791-8631-4ca5-9e4b-bb857d6264a8/alertmanager/0.log" Mar 18 14:57:49 crc kubenswrapper[4756]: I0318 14:57:49.622787 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_529ae791-8631-4ca5-9e4b-bb857d6264a8/config-reloader/0.log" Mar 18 14:57:49 crc kubenswrapper[4756]: I0318 14:57:49.683314 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_529ae791-8631-4ca5-9e4b-bb857d6264a8/init-config-reloader/0.log" Mar 18 14:57:49 crc kubenswrapper[4756]: I0318 14:57:49.887254 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-58c774f67d-hdzcx_ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f/barbican-api/0.log" Mar 18 14:57:50 crc kubenswrapper[4756]: I0318 14:57:50.054296 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-58c774f67d-hdzcx_ddfe8af8-356d-4e42-ade9-26a1d1a8cb4f/barbican-api-log/0.log" Mar 18 14:57:50 crc kubenswrapper[4756]: I0318 14:57:50.221879 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7995d6cd86-6kx6b_7ad060be-35eb-4d9d-8a45-0a387009708c/barbican-keystone-listener/0.log" Mar 18 14:57:50 crc kubenswrapper[4756]: I0318 14:57:50.316211 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:57:50 crc kubenswrapper[4756]: E0318 14:57:50.316541 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:57:50 crc kubenswrapper[4756]: I0318 14:57:50.419272 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6ccf458dc-bmbzj_8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5/barbican-worker/0.log" Mar 18 14:57:50 crc kubenswrapper[4756]: 
I0318 14:57:50.498008 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7995d6cd86-6kx6b_7ad060be-35eb-4d9d-8a45-0a387009708c/barbican-keystone-listener-log/0.log" Mar 18 14:57:50 crc kubenswrapper[4756]: I0318 14:57:50.543294 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6ccf458dc-bmbzj_8bd27aea-16a7-4a7b-ad8f-cfd43ba3ece5/barbican-worker-log/0.log" Mar 18 14:57:50 crc kubenswrapper[4756]: I0318 14:57:50.875970 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-hmffq_06faf283-4cbe-459f-81b6-ca3f598ae5b0/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:57:51 crc kubenswrapper[4756]: I0318 14:57:51.117534 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_14a36d4c-c545-467d-a0a8-0aa38de63eb1/ceilometer-central-agent/0.log" Mar 18 14:57:51 crc kubenswrapper[4756]: I0318 14:57:51.278030 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_14a36d4c-c545-467d-a0a8-0aa38de63eb1/sg-core/0.log" Mar 18 14:57:51 crc kubenswrapper[4756]: I0318 14:57:51.297805 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_14a36d4c-c545-467d-a0a8-0aa38de63eb1/proxy-httpd/0.log" Mar 18 14:57:51 crc kubenswrapper[4756]: I0318 14:57:51.322291 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_14a36d4c-c545-467d-a0a8-0aa38de63eb1/ceilometer-notification-agent/0.log" Mar 18 14:57:51 crc kubenswrapper[4756]: I0318 14:57:51.698337 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_506de922-637d-4174-aeaa-236a27140466/cinder-api/0.log" Mar 18 14:57:51 crc kubenswrapper[4756]: I0318 14:57:51.789621 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_506de922-637d-4174-aeaa-236a27140466/cinder-api-log/0.log" Mar 18 
14:57:52 crc kubenswrapper[4756]: I0318 14:57:52.216026 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_4ffb8a9a-4e52-4db8-a22f-994a2cf222cf/probe/0.log" Mar 18 14:57:52 crc kubenswrapper[4756]: I0318 14:57:52.299568 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_4ffb8a9a-4e52-4db8-a22f-994a2cf222cf/cinder-scheduler/0.log" Mar 18 14:57:52 crc kubenswrapper[4756]: I0318 14:57:52.433935 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_518ce4af-ca96-475d-9514-f7304f6a2498/cloudkitty-api/0.log" Mar 18 14:57:52 crc kubenswrapper[4756]: I0318 14:57:52.566225 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_518ce4af-ca96-475d-9514-f7304f6a2498/cloudkitty-api-log/0.log" Mar 18 14:57:52 crc kubenswrapper[4756]: I0318 14:57:52.715334 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_06157d4c-39c0-4895-b002-79ee7b960512/loki-compactor/0.log" Mar 18 14:57:52 crc kubenswrapper[4756]: I0318 14:57:52.998394 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-5d547bbd4d-b2tdd_e7acb694-7937-45a7-8aab-c2175fae6423/loki-distributor/0.log" Mar 18 14:57:53 crc kubenswrapper[4756]: I0318 14:57:53.054862 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-6b884dc4b5-8pq9g_a3a9e126-317d-4f52-a4f8-e657dfa9930c/gateway/0.log" Mar 18 14:57:53 crc kubenswrapper[4756]: I0318 14:57:53.612932 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-6b884dc4b5-p44r5_0e90a533-9bd6-4f94-a8c4-52218f2919b0/gateway/0.log" Mar 18 14:57:53 crc kubenswrapper[4756]: I0318 14:57:53.701933 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_3528f495-dffb-47d3-99fe-69054008e8cd/loki-index-gateway/0.log" Mar 18 14:57:54 crc kubenswrapper[4756]: I0318 14:57:54.388248 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_ca8fb9b6-a1a9-4781-af8f-2e7e78e62771/loki-ingester/0.log" Mar 18 14:57:54 crc kubenswrapper[4756]: I0318 14:57:54.497586 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-668f98fdd7-sv5g4_7edadc4c-acee-49a2-b629-c4505d40eebc/loki-querier/0.log" Mar 18 14:57:54 crc kubenswrapper[4756]: I0318 14:57:54.574277 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-6f54889599-4pgf5_4999c7cd-7963-42e4-8404-a0203664d331/loki-query-frontend/0.log" Mar 18 14:57:55 crc kubenswrapper[4756]: I0318 14:57:55.366628 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-l5b7z_437705be-47a9-4902-9b5d-c8f293a3985e/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:57:55 crc kubenswrapper[4756]: I0318 14:57:55.428131 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-s2hzd_dd7799ac-8443-42eb-ab9e-47e654eb8dca/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:57:55 crc kubenswrapper[4756]: I0318 14:57:55.628170 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-9qbkr_8f23b254-5d70-4ac0-948f-cfa6974416fd/init/0.log" Mar 18 14:57:56 crc kubenswrapper[4756]: I0318 14:57:56.038973 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-9qbkr_8f23b254-5d70-4ac0-948f-cfa6974416fd/dnsmasq-dns/0.log" Mar 18 14:57:56 crc kubenswrapper[4756]: I0318 14:57:56.055342 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-9qbkr_8f23b254-5d70-4ac0-948f-cfa6974416fd/init/0.log" Mar 18 14:57:56 crc kubenswrapper[4756]: I0318 14:57:56.265341 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-xqnmr_570b883b-276f-43d9-983b-3f99763e7e4d/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:57:56 crc kubenswrapper[4756]: I0318 14:57:56.508430 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_90742574-07ba-4265-aa05-59c9f557caf0/glance-httpd/0.log" Mar 18 14:57:56 crc kubenswrapper[4756]: I0318 14:57:56.663605 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_90742574-07ba-4265-aa05-59c9f557caf0/glance-log/0.log" Mar 18 14:57:56 crc kubenswrapper[4756]: I0318 14:57:56.836725 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_8e468a87-954b-4269-b75f-e8aed6cc63aa/cloudkitty-proc/0.log" Mar 18 14:57:56 crc kubenswrapper[4756]: I0318 14:57:56.843714 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2d168f1b-0779-47a8-8346-254ecfc9a126/glance-httpd/0.log" Mar 18 14:57:56 crc kubenswrapper[4756]: I0318 14:57:56.931616 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2d168f1b-0779-47a8-8346-254ecfc9a126/glance-log/0.log" Mar 18 14:57:57 crc kubenswrapper[4756]: I0318 14:57:57.118329 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-dtpdm_eda86959-5a56-443e-b21e-9d0dcd73e6b6/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:57:57 crc kubenswrapper[4756]: I0318 14:57:57.175518 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-znvdh_52e4cd39-a658-42c6-b1d3-2f7a144688f1/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:57:57 crc kubenswrapper[4756]: I0318 14:57:57.523136 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7f64d44848-xg692_e0973f28-44a0-4f66-aadf-42187c9ced68/keystone-api/0.log" Mar 18 14:57:57 crc kubenswrapper[4756]: I0318 14:57:57.596928 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_7707ba89-0a52-4e4e-bb4d-d381e3663d46/kube-state-metrics/0.log" Mar 18 14:57:57 crc kubenswrapper[4756]: I0318 14:57:57.856545 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-xft9f_2a04f637-4e7d-4efc-8fc0-ce511f450960/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:57:58 crc kubenswrapper[4756]: I0318 14:57:58.295656 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-766db9c4f-fbsb4_05701c8c-2d0a-47aa-be93-6e037492fb49/neutron-api/0.log" Mar 18 14:57:58 crc kubenswrapper[4756]: I0318 14:57:58.345429 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-766db9c4f-fbsb4_05701c8c-2d0a-47aa-be93-6e037492fb49/neutron-httpd/0.log" Mar 18 14:57:58 crc kubenswrapper[4756]: I0318 14:57:58.630577 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlv5c_044cc30f-7755-47c5-8b78-84c89ee897bf/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:57:59 crc kubenswrapper[4756]: I0318 14:57:59.248514 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a634960a-3e75-4837-a471-0a228302abe0/nova-api-log/0.log" Mar 18 14:57:59 crc kubenswrapper[4756]: I0318 14:57:59.331901 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_a634960a-3e75-4837-a471-0a228302abe0/nova-api-api/0.log" Mar 18 14:57:59 crc kubenswrapper[4756]: I0318 14:57:59.371002 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_eedb3d11-4f6e-4329-8f2e-c1c56b44a3e3/nova-cell0-conductor-conductor/0.log" Mar 18 14:57:59 crc kubenswrapper[4756]: I0318 14:57:59.697357 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8bf60e6c-1de0-4b66-84fe-635d0d235bad/nova-cell1-conductor-conductor/0.log" Mar 18 14:57:59 crc kubenswrapper[4756]: I0318 14:57:59.882457 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_e08e6109-a0d3-4f6a-a790-c2e1725f635c/nova-cell1-novncproxy-novncproxy/0.log" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.169977 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564098-hrrsk"] Mar 18 14:58:00 crc kubenswrapper[4756]: E0318 14:58:00.170426 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abab2751-a427-4e83-a7ef-7393eed94c71" containerName="extract-content" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.170444 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="abab2751-a427-4e83-a7ef-7393eed94c71" containerName="extract-content" Mar 18 14:58:00 crc kubenswrapper[4756]: E0318 14:58:00.170458 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abab2751-a427-4e83-a7ef-7393eed94c71" containerName="registry-server" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.170464 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="abab2751-a427-4e83-a7ef-7393eed94c71" containerName="registry-server" Mar 18 14:58:00 crc kubenswrapper[4756]: E0318 14:58:00.170494 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abab2751-a427-4e83-a7ef-7393eed94c71" containerName="extract-utilities" Mar 18 14:58:00 crc 
kubenswrapper[4756]: I0318 14:58:00.170501 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="abab2751-a427-4e83-a7ef-7393eed94c71" containerName="extract-utilities" Mar 18 14:58:00 crc kubenswrapper[4756]: E0318 14:58:00.170512 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c46695c-ccbe-405c-9cae-c64d49f4d557" containerName="container-00" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.170518 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c46695c-ccbe-405c-9cae-c64d49f4d557" containerName="container-00" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.170704 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="abab2751-a427-4e83-a7ef-7393eed94c71" containerName="registry-server" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.170715 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c46695c-ccbe-405c-9cae-c64d49f4d557" containerName="container-00" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.171460 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564098-hrrsk" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.174132 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.174386 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.175424 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.183952 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564098-hrrsk"] Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.266784 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtqzq\" (UniqueName: \"kubernetes.io/projected/51e5d910-0d11-4baa-987f-ffe6ba4e5e99-kube-api-access-qtqzq\") pod \"auto-csr-approver-29564098-hrrsk\" (UID: \"51e5d910-0d11-4baa-987f-ffe6ba4e5e99\") " pod="openshift-infra/auto-csr-approver-29564098-hrrsk" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.272061 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-nvw5h_ddf30b37-8904-4a9f-8b73-8afe413c778b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.371187 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtqzq\" (UniqueName: \"kubernetes.io/projected/51e5d910-0d11-4baa-987f-ffe6ba4e5e99-kube-api-access-qtqzq\") pod \"auto-csr-approver-29564098-hrrsk\" (UID: \"51e5d910-0d11-4baa-987f-ffe6ba4e5e99\") " pod="openshift-infra/auto-csr-approver-29564098-hrrsk" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.398213 4756 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtqzq\" (UniqueName: \"kubernetes.io/projected/51e5d910-0d11-4baa-987f-ffe6ba4e5e99-kube-api-access-qtqzq\") pod \"auto-csr-approver-29564098-hrrsk\" (UID: \"51e5d910-0d11-4baa-987f-ffe6ba4e5e99\") " pod="openshift-infra/auto-csr-approver-29564098-hrrsk" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.502978 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564098-hrrsk" Mar 18 14:58:00 crc kubenswrapper[4756]: I0318 14:58:00.712101 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2895ca94-33ed-4aa0-bd42-af3b10592ae4/nova-metadata-log/0.log" Mar 18 14:58:01 crc kubenswrapper[4756]: I0318 14:58:01.018719 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2895ca94-33ed-4aa0-bd42-af3b10592ae4/nova-metadata-metadata/0.log" Mar 18 14:58:01 crc kubenswrapper[4756]: I0318 14:58:01.136736 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d87ac058-1e5e-4aa6-801d-a7a92e65d112/nova-scheduler-scheduler/0.log" Mar 18 14:58:01 crc kubenswrapper[4756]: I0318 14:58:01.372758 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564098-hrrsk"] Mar 18 14:58:01 crc kubenswrapper[4756]: I0318 14:58:01.404461 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b5d3bcfe-0ae1-4104-8433-ffb4569a29d8/mysql-bootstrap/0.log" Mar 18 14:58:01 crc kubenswrapper[4756]: I0318 14:58:01.750952 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b5d3bcfe-0ae1-4104-8433-ffb4569a29d8/mysql-bootstrap/0.log" Mar 18 14:58:01 crc kubenswrapper[4756]: I0318 14:58:01.798686 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe/mysql-bootstrap/0.log" Mar 18 14:58:01 crc kubenswrapper[4756]: I0318 14:58:01.875334 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b5d3bcfe-0ae1-4104-8433-ffb4569a29d8/galera/0.log" Mar 18 14:58:01 crc kubenswrapper[4756]: I0318 14:58:01.959246 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564098-hrrsk" event={"ID":"51e5d910-0d11-4baa-987f-ffe6ba4e5e99","Type":"ContainerStarted","Data":"9c7af74137167d27bc2d7c56cdff953a9eddf886c8a1fd303e2977de87eb4b4e"} Mar 18 14:58:02 crc kubenswrapper[4756]: I0318 14:58:02.225872 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe/galera/0.log" Mar 18 14:58:02 crc kubenswrapper[4756]: I0318 14:58:02.232722 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_26157f10-41ab-4c9e-836d-136febf288cd/openstackclient/0.log" Mar 18 14:58:02 crc kubenswrapper[4756]: I0318 14:58:02.298462 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7bbb1faa-dcc0-4dcd-bf97-9c2b1904a9fe/mysql-bootstrap/0.log" Mar 18 14:58:02 crc kubenswrapper[4756]: I0318 14:58:02.538645 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-gkrbd_ac7f17bf-2987-447a-a61b-c0b97615ced5/openstack-network-exporter/0.log" Mar 18 14:58:02 crc kubenswrapper[4756]: I0318 14:58:02.875234 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vsj9n_3f5c347c-244f-40b6-8311-8eac0e22626a/ovsdb-server-init/0.log" Mar 18 14:58:03 crc kubenswrapper[4756]: I0318 14:58:03.315867 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:58:03 crc kubenswrapper[4756]: E0318 14:58:03.316704 4756 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:58:03 crc kubenswrapper[4756]: I0318 14:58:03.372302 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vsj9n_3f5c347c-244f-40b6-8311-8eac0e22626a/ovsdb-server-init/0.log" Mar 18 14:58:03 crc kubenswrapper[4756]: I0318 14:58:03.410561 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vsj9n_3f5c347c-244f-40b6-8311-8eac0e22626a/ovs-vswitchd/0.log" Mar 18 14:58:03 crc kubenswrapper[4756]: I0318 14:58:03.536470 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vsj9n_3f5c347c-244f-40b6-8311-8eac0e22626a/ovsdb-server/0.log" Mar 18 14:58:03 crc kubenswrapper[4756]: I0318 14:58:03.738644 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-r6s8c_99dfb896-59f3-4f93-8d0e-4b19b49cbc56/ovn-controller/0.log" Mar 18 14:58:03 crc kubenswrapper[4756]: I0318 14:58:03.935785 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-b68mn_b3d1261b-4146-4fc4-baa5-79ac98704bcd/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:58:04 crc kubenswrapper[4756]: I0318 14:58:04.006864 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564098-hrrsk" event={"ID":"51e5d910-0d11-4baa-987f-ffe6ba4e5e99","Type":"ContainerStarted","Data":"f31486a4e7cc44f2f575797f0308b587f93c51618a81d609bc5a8256ab79c483"} Mar 18 14:58:04 crc kubenswrapper[4756]: I0318 14:58:04.035504 4756 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564098-hrrsk" podStartSLOduration=2.167543794 podStartE2EDuration="4.035484898s" podCreationTimestamp="2026-03-18 14:58:00 +0000 UTC" firstStartedPulling="2026-03-18 14:58:01.319320243 +0000 UTC m=+3482.633738208" lastFinishedPulling="2026-03-18 14:58:03.187261337 +0000 UTC m=+3484.501679312" observedRunningTime="2026-03-18 14:58:04.022384803 +0000 UTC m=+3485.336802778" watchObservedRunningTime="2026-03-18 14:58:04.035484898 +0000 UTC m=+3485.349902873" Mar 18 14:58:04 crc kubenswrapper[4756]: I0318 14:58:04.059601 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ac919d8c-6a4e-4239-a76c-5cbefcd01ce6/openstack-network-exporter/0.log" Mar 18 14:58:04 crc kubenswrapper[4756]: I0318 14:58:04.216603 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ac919d8c-6a4e-4239-a76c-5cbefcd01ce6/ovn-northd/0.log" Mar 18 14:58:04 crc kubenswrapper[4756]: I0318 14:58:04.400157 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ba7aaad4-94c3-4202-a512-a84cba9bcb9f/ovsdbserver-nb/0.log" Mar 18 14:58:04 crc kubenswrapper[4756]: I0318 14:58:04.651972 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ba7aaad4-94c3-4202-a512-a84cba9bcb9f/openstack-network-exporter/0.log" Mar 18 14:58:04 crc kubenswrapper[4756]: I0318 14:58:04.826977 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ee4b1cdc-8b62-42b6-9bc7-61164f90afb4/ovsdbserver-sb/0.log" Mar 18 14:58:04 crc kubenswrapper[4756]: I0318 14:58:04.878545 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ee4b1cdc-8b62-42b6-9bc7-61164f90afb4/openstack-network-exporter/0.log" Mar 18 14:58:05 crc kubenswrapper[4756]: I0318 14:58:05.015925 4756 generic.go:334] "Generic (PLEG): container finished" podID="51e5d910-0d11-4baa-987f-ffe6ba4e5e99" 
containerID="f31486a4e7cc44f2f575797f0308b587f93c51618a81d609bc5a8256ab79c483" exitCode=0 Mar 18 14:58:05 crc kubenswrapper[4756]: I0318 14:58:05.015966 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564098-hrrsk" event={"ID":"51e5d910-0d11-4baa-987f-ffe6ba4e5e99","Type":"ContainerDied","Data":"f31486a4e7cc44f2f575797f0308b587f93c51618a81d609bc5a8256ab79c483"} Mar 18 14:58:05 crc kubenswrapper[4756]: I0318 14:58:05.035866 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-654d496b7d-zrbn7_fc4b5923-eb57-490b-a642-1a56d8a7b9b7/placement-api/0.log" Mar 18 14:58:05 crc kubenswrapper[4756]: I0318 14:58:05.320245 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-654d496b7d-zrbn7_fc4b5923-eb57-490b-a642-1a56d8a7b9b7/placement-log/0.log" Mar 18 14:58:05 crc kubenswrapper[4756]: I0318 14:58:05.402506 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d3674269-04c7-45df-ad72-38d1bb5aab93/init-config-reloader/0.log" Mar 18 14:58:05 crc kubenswrapper[4756]: I0318 14:58:05.616438 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d3674269-04c7-45df-ad72-38d1bb5aab93/config-reloader/0.log" Mar 18 14:58:05 crc kubenswrapper[4756]: I0318 14:58:05.645984 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d3674269-04c7-45df-ad72-38d1bb5aab93/init-config-reloader/0.log" Mar 18 14:58:05 crc kubenswrapper[4756]: I0318 14:58:05.732632 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d3674269-04c7-45df-ad72-38d1bb5aab93/prometheus/0.log" Mar 18 14:58:05 crc kubenswrapper[4756]: I0318 14:58:05.756317 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d3674269-04c7-45df-ad72-38d1bb5aab93/thanos-sidecar/0.log" Mar 18 
14:58:05 crc kubenswrapper[4756]: I0318 14:58:05.994330 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c6dd5f14-94cd-4fee-9798-8c93d27de8b9/setup-container/0.log" Mar 18 14:58:06 crc kubenswrapper[4756]: I0318 14:58:06.417737 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c6dd5f14-94cd-4fee-9798-8c93d27de8b9/setup-container/0.log" Mar 18 14:58:06 crc kubenswrapper[4756]: I0318 14:58:06.536410 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c6dd5f14-94cd-4fee-9798-8c93d27de8b9/rabbitmq/0.log" Mar 18 14:58:06 crc kubenswrapper[4756]: I0318 14:58:06.819602 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cc1f1584-6d11-4821-8a1d-4a58648313e3/setup-container/0.log" Mar 18 14:58:06 crc kubenswrapper[4756]: I0318 14:58:06.938638 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cc1f1584-6d11-4821-8a1d-4a58648313e3/setup-container/0.log" Mar 18 14:58:07 crc kubenswrapper[4756]: I0318 14:58:07.044442 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564098-hrrsk" event={"ID":"51e5d910-0d11-4baa-987f-ffe6ba4e5e99","Type":"ContainerDied","Data":"9c7af74137167d27bc2d7c56cdff953a9eddf886c8a1fd303e2977de87eb4b4e"} Mar 18 14:58:07 crc kubenswrapper[4756]: I0318 14:58:07.044484 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c7af74137167d27bc2d7c56cdff953a9eddf886c8a1fd303e2977de87eb4b4e" Mar 18 14:58:07 crc kubenswrapper[4756]: I0318 14:58:07.138545 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cc1f1584-6d11-4821-8a1d-4a58648313e3/rabbitmq/0.log" Mar 18 14:58:07 crc kubenswrapper[4756]: I0318 14:58:07.140521 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564098-hrrsk" Mar 18 14:58:07 crc kubenswrapper[4756]: I0318 14:58:07.258935 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtqzq\" (UniqueName: \"kubernetes.io/projected/51e5d910-0d11-4baa-987f-ffe6ba4e5e99-kube-api-access-qtqzq\") pod \"51e5d910-0d11-4baa-987f-ffe6ba4e5e99\" (UID: \"51e5d910-0d11-4baa-987f-ffe6ba4e5e99\") " Mar 18 14:58:07 crc kubenswrapper[4756]: I0318 14:58:07.272828 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51e5d910-0d11-4baa-987f-ffe6ba4e5e99-kube-api-access-qtqzq" (OuterVolumeSpecName: "kube-api-access-qtqzq") pod "51e5d910-0d11-4baa-987f-ffe6ba4e5e99" (UID: "51e5d910-0d11-4baa-987f-ffe6ba4e5e99"). InnerVolumeSpecName "kube-api-access-qtqzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:58:07 crc kubenswrapper[4756]: I0318 14:58:07.299425 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-t5fhc_b9d661cc-2616-43d2-9db7-7111e119a569/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:58:07 crc kubenswrapper[4756]: I0318 14:58:07.360839 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtqzq\" (UniqueName: \"kubernetes.io/projected/51e5d910-0d11-4baa-987f-ffe6ba4e5e99-kube-api-access-qtqzq\") on node \"crc\" DevicePath \"\"" Mar 18 14:58:07 crc kubenswrapper[4756]: I0318 14:58:07.493377 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-5m7kb_2fa72d83-fcee-4f4a-8105-15921e405491/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:58:07 crc kubenswrapper[4756]: I0318 14:58:07.564424 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-lzbfg_fc4e996a-8348-462e-87dd-552f33102a82/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:58:08 crc kubenswrapper[4756]: I0318 14:58:08.050842 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564098-hrrsk" Mar 18 14:58:08 crc kubenswrapper[4756]: I0318 14:58:08.272574 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564092-vjjw9"] Mar 18 14:58:08 crc kubenswrapper[4756]: I0318 14:58:08.287374 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564092-vjjw9"] Mar 18 14:58:08 crc kubenswrapper[4756]: I0318 14:58:08.358819 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-vb9nj_39717331-913f-4e8f-b7c1-e8f8148dcd92/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:58:08 crc kubenswrapper[4756]: I0318 14:58:08.454991 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-sh6cs_799f12f9-8c31-4518-a210-e117606e6d8e/ssh-known-hosts-edpm-deployment/0.log" Mar 18 14:58:08 crc kubenswrapper[4756]: I0318 14:58:08.710635 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7454cc5499-pkq5t_d3a58bb2-2a9a-4867-a60d-8ea354621ff6/proxy-server/0.log" Mar 18 14:58:08 crc kubenswrapper[4756]: I0318 14:58:08.853871 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7454cc5499-pkq5t_d3a58bb2-2a9a-4867-a60d-8ea354621ff6/proxy-httpd/0.log" Mar 18 14:58:09 crc kubenswrapper[4756]: I0318 14:58:09.124060 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-2crm4_00bfff2e-d59e-4936-b0e1-3476f2d01242/swift-ring-rebalance/0.log" Mar 18 14:58:09 crc kubenswrapper[4756]: I0318 14:58:09.236670 4756 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e4d16-f5d1-4367-ad0e-baf820923225/account-reaper/0.log" Mar 18 14:58:09 crc kubenswrapper[4756]: I0318 14:58:09.299755 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e4d16-f5d1-4367-ad0e-baf820923225/account-auditor/0.log" Mar 18 14:58:09 crc kubenswrapper[4756]: I0318 14:58:09.346357 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b3619f6-418c-4ffc-81c3-b77245662820" path="/var/lib/kubelet/pods/9b3619f6-418c-4ffc-81c3-b77245662820/volumes" Mar 18 14:58:09 crc kubenswrapper[4756]: I0318 14:58:09.439031 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e4d16-f5d1-4367-ad0e-baf820923225/account-server/0.log" Mar 18 14:58:09 crc kubenswrapper[4756]: I0318 14:58:09.502837 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e4d16-f5d1-4367-ad0e-baf820923225/account-replicator/0.log" Mar 18 14:58:09 crc kubenswrapper[4756]: I0318 14:58:09.602550 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e4d16-f5d1-4367-ad0e-baf820923225/container-auditor/0.log" Mar 18 14:58:09 crc kubenswrapper[4756]: I0318 14:58:09.753773 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e4d16-f5d1-4367-ad0e-baf820923225/container-replicator/0.log" Mar 18 14:58:09 crc kubenswrapper[4756]: I0318 14:58:09.899506 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e4d16-f5d1-4367-ad0e-baf820923225/container-server/0.log" Mar 18 14:58:09 crc kubenswrapper[4756]: I0318 14:58:09.934545 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e4d16-f5d1-4367-ad0e-baf820923225/container-updater/0.log" Mar 18 14:58:10 crc kubenswrapper[4756]: I0318 14:58:10.031294 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_717e4d16-f5d1-4367-ad0e-baf820923225/object-auditor/0.log" Mar 18 14:58:10 crc kubenswrapper[4756]: I0318 14:58:10.309344 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e4d16-f5d1-4367-ad0e-baf820923225/object-expirer/0.log" Mar 18 14:58:10 crc kubenswrapper[4756]: I0318 14:58:10.322701 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e4d16-f5d1-4367-ad0e-baf820923225/object-replicator/0.log" Mar 18 14:58:10 crc kubenswrapper[4756]: I0318 14:58:10.361401 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e4d16-f5d1-4367-ad0e-baf820923225/object-server/0.log" Mar 18 14:58:10 crc kubenswrapper[4756]: I0318 14:58:10.425263 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e4d16-f5d1-4367-ad0e-baf820923225/object-updater/0.log" Mar 18 14:58:10 crc kubenswrapper[4756]: I0318 14:58:10.666112 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e4d16-f5d1-4367-ad0e-baf820923225/rsync/0.log" Mar 18 14:58:10 crc kubenswrapper[4756]: I0318 14:58:10.749441 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e4d16-f5d1-4367-ad0e-baf820923225/swift-recon-cron/0.log" Mar 18 14:58:10 crc kubenswrapper[4756]: I0318 14:58:10.842022 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4xx9c_c5555558-89a8-4faa-aeb3-0ee1110796be/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:58:11 crc kubenswrapper[4756]: I0318 14:58:11.140989 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_5e78c973-235a-4dcb-927f-7a5d35e786cc/tempest-tests-tempest-tests-runner/0.log" Mar 18 14:58:11 crc kubenswrapper[4756]: I0318 14:58:11.308452 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_47cc7818-59ef-442f-9fce-df34f7275bc4/test-operator-logs-container/0.log" Mar 18 14:58:11 crc kubenswrapper[4756]: I0318 14:58:11.455817 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-spvxh_9422548d-30ba-46b8-a1b0-3a6dfa64bd70/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 14:58:11 crc kubenswrapper[4756]: I0318 14:58:11.973171 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_18faa7ad-0836-473a-aebe-0a6e5357b554/memcached/0.log" Mar 18 14:58:17 crc kubenswrapper[4756]: I0318 14:58:17.316737 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:58:17 crc kubenswrapper[4756]: E0318 14:58:17.317357 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 14:58:31 crc kubenswrapper[4756]: I0318 14:58:31.315073 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:58:31 crc kubenswrapper[4756]: E0318 14:58:31.315962 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 
18 14:58:34 crc kubenswrapper[4756]: I0318 14:58:34.879944 4756 scope.go:117] "RemoveContainer" containerID="b78738cf7e954838da5435271146ff418efdd1a10ec15cdf2264881f6c5f4948" Mar 18 14:58:43 crc kubenswrapper[4756]: I0318 14:58:43.315955 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 14:58:44 crc kubenswrapper[4756]: I0318 14:58:44.496853 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"1da14f46d57c4462e93c15657dccaca66ada6e735fdcca329fc0a7b2046da6cd"} Mar 18 14:58:59 crc kubenswrapper[4756]: I0318 14:58:59.195165 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4_869373d9-b980-4c6a-80aa-3ab7a2e046a2/util/0.log" Mar 18 14:58:59 crc kubenswrapper[4756]: I0318 14:58:59.496809 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4_869373d9-b980-4c6a-80aa-3ab7a2e046a2/util/0.log" Mar 18 14:58:59 crc kubenswrapper[4756]: I0318 14:58:59.633376 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4_869373d9-b980-4c6a-80aa-3ab7a2e046a2/pull/0.log" Mar 18 14:58:59 crc kubenswrapper[4756]: I0318 14:58:59.636483 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4_869373d9-b980-4c6a-80aa-3ab7a2e046a2/pull/0.log" Mar 18 14:59:00 crc kubenswrapper[4756]: I0318 14:59:00.022987 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4_869373d9-b980-4c6a-80aa-3ab7a2e046a2/util/0.log" Mar 18 
14:59:00 crc kubenswrapper[4756]: I0318 14:59:00.130853 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4_869373d9-b980-4c6a-80aa-3ab7a2e046a2/extract/0.log" Mar 18 14:59:00 crc kubenswrapper[4756]: I0318 14:59:00.142758 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_484aa3d9c3a8164f147add19be293dd9d32025354848ad7e57fd528e10v8th4_869373d9-b980-4c6a-80aa-3ab7a2e046a2/pull/0.log" Mar 18 14:59:00 crc kubenswrapper[4756]: I0318 14:59:00.366773 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-gmmtd_a5afd6e2-d647-4165-a9ef-506d7e16173c/manager/0.log" Mar 18 14:59:00 crc kubenswrapper[4756]: I0318 14:59:00.707851 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-mhqlh_d05621f7-0f1e-4b58-b016-6cdb083fed42/manager/0.log" Mar 18 14:59:01 crc kubenswrapper[4756]: I0318 14:59:01.242976 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-p6hhx_aab73c06-7468-4302-ab88-6c91308ca2ac/manager/0.log" Mar 18 14:59:01 crc kubenswrapper[4756]: I0318 14:59:01.419718 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-k87mt_6e2eb455-f2ff-40c0-9c26-67c675c1102f/manager/0.log" Mar 18 14:59:01 crc kubenswrapper[4756]: I0318 14:59:01.515902 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-mskn7_1a62c450-4f9c-4c7b-a864-2600eb6c8589/manager/0.log" Mar 18 14:59:01 crc kubenswrapper[4756]: I0318 14:59:01.535002 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-vgbf2_67b596ac-2e44-42b4-99b2-fd1c8712aaed/manager/0.log" Mar 18 14:59:01 crc kubenswrapper[4756]: I0318 14:59:01.877720 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-wx92j_400b73ce-6d8f-4392-b47b-fef88e8452bd/manager/0.log" Mar 18 14:59:02 crc kubenswrapper[4756]: I0318 14:59:02.318822 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-hwrx5_a632ee17-dd9e-4ec8-b281-7224395bd2fe/manager/0.log" Mar 18 14:59:02 crc kubenswrapper[4756]: I0318 14:59:02.332549 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-9lwv9_ba85373b-8f2d-4f13-8ea7-0648b49074da/manager/0.log" Mar 18 14:59:02 crc kubenswrapper[4756]: I0318 14:59:02.437977 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-bz77b_dfcb60c2-5eba-4f32-a738-8c79d6c36df7/manager/0.log" Mar 18 14:59:02 crc kubenswrapper[4756]: I0318 14:59:02.800440 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-fk8pc_83526c83-3dbe-42b4-a101-7ce37495b4cf/manager/0.log" Mar 18 14:59:02 crc kubenswrapper[4756]: I0318 14:59:02.847987 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-br68j_2efade7a-e4d1-43ab-a237-139b25a4163c/manager/0.log" Mar 18 14:59:03 crc kubenswrapper[4756]: I0318 14:59:03.122534 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-ggbzf_e377c7f8-bd46-4193-a288-91f593cc5a25/manager/0.log" Mar 18 14:59:03 crc kubenswrapper[4756]: I0318 14:59:03.282659 4756 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-fzxth_5c617272-2409-489c-8093-3c943a117a23/manager/0.log" Mar 18 14:59:03 crc kubenswrapper[4756]: I0318 14:59:03.423027 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-m846d_d48d57d1-c314-4a7c-bd51-26ec5cfebbd1/manager/0.log" Mar 18 14:59:03 crc kubenswrapper[4756]: I0318 14:59:03.625516 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5847fcc4fb-jhc4m_96217702-f9ab-4f27-bcf1-f2bcd60058c0/operator/0.log" Mar 18 14:59:04 crc kubenswrapper[4756]: I0318 14:59:04.073567 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-j7ksc_6436d7cb-143a-4d2a-8fb7-9288ca809679/registry-server/0.log" Mar 18 14:59:04 crc kubenswrapper[4756]: I0318 14:59:04.577472 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-j9sw7_f5e9df6f-3f17-4a97-a335-332fb636f9dd/manager/0.log" Mar 18 14:59:04 crc kubenswrapper[4756]: I0318 14:59:04.854274 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-k6gpj_c469b9e9-509c-4265-bc75-3d80d75c4365/manager/0.log" Mar 18 14:59:04 crc kubenswrapper[4756]: I0318 14:59:04.972202 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-f84d7fd4f-tksnb_d78c665d-9f25-4d44-80eb-12324454e435/manager/0.log" Mar 18 14:59:05 crc kubenswrapper[4756]: I0318 14:59:05.144000 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-sd7vp_c37f78a1-6298-4f65-9dac-6f597ed75a31/operator/0.log" Mar 18 14:59:05 crc kubenswrapper[4756]: I0318 14:59:05.757654 4756 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5b79d7bc79-kn5zn_726ef03d-4a13-449c-8866-c6f5ee240873/manager/0.log" Mar 18 14:59:05 crc kubenswrapper[4756]: I0318 14:59:05.788789 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-4pz7p_d87d4d63-23ac-4366-88d2-5d8803a7322e/manager/0.log" Mar 18 14:59:05 crc kubenswrapper[4756]: I0318 14:59:05.945761 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-phw4g_0a54e6d2-b6e2-4808-8c1d-e12b975702cb/manager/0.log" Mar 18 14:59:06 crc kubenswrapper[4756]: I0318 14:59:06.021656 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-wsh26_1187469f-b925-4823-8e9e-f3721c8b299b/manager/0.log" Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.186052 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wls8z"] Mar 18 14:59:40 crc kubenswrapper[4756]: E0318 14:59:40.187964 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51e5d910-0d11-4baa-987f-ffe6ba4e5e99" containerName="oc" Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.187980 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="51e5d910-0d11-4baa-987f-ffe6ba4e5e99" containerName="oc" Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.188688 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="51e5d910-0d11-4baa-987f-ffe6ba4e5e99" containerName="oc" Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.191558 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.208543 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wls8z"] Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.308301 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhklc\" (UniqueName: \"kubernetes.io/projected/4dddc9c6-f034-48d8-951b-ed880aac7ec6-kube-api-access-hhklc\") pod \"redhat-marketplace-wls8z\" (UID: \"4dddc9c6-f034-48d8-951b-ed880aac7ec6\") " pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.308370 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dddc9c6-f034-48d8-951b-ed880aac7ec6-utilities\") pod \"redhat-marketplace-wls8z\" (UID: \"4dddc9c6-f034-48d8-951b-ed880aac7ec6\") " pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.308434 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dddc9c6-f034-48d8-951b-ed880aac7ec6-catalog-content\") pod \"redhat-marketplace-wls8z\" (UID: \"4dddc9c6-f034-48d8-951b-ed880aac7ec6\") " pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.409840 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhklc\" (UniqueName: \"kubernetes.io/projected/4dddc9c6-f034-48d8-951b-ed880aac7ec6-kube-api-access-hhklc\") pod \"redhat-marketplace-wls8z\" (UID: \"4dddc9c6-f034-48d8-951b-ed880aac7ec6\") " pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.410196 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dddc9c6-f034-48d8-951b-ed880aac7ec6-utilities\") pod \"redhat-marketplace-wls8z\" (UID: \"4dddc9c6-f034-48d8-951b-ed880aac7ec6\") " pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.410277 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dddc9c6-f034-48d8-951b-ed880aac7ec6-catalog-content\") pod \"redhat-marketplace-wls8z\" (UID: \"4dddc9c6-f034-48d8-951b-ed880aac7ec6\") " pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.410633 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dddc9c6-f034-48d8-951b-ed880aac7ec6-utilities\") pod \"redhat-marketplace-wls8z\" (UID: \"4dddc9c6-f034-48d8-951b-ed880aac7ec6\") " pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.410718 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dddc9c6-f034-48d8-951b-ed880aac7ec6-catalog-content\") pod \"redhat-marketplace-wls8z\" (UID: \"4dddc9c6-f034-48d8-951b-ed880aac7ec6\") " pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.430458 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhklc\" (UniqueName: \"kubernetes.io/projected/4dddc9c6-f034-48d8-951b-ed880aac7ec6-kube-api-access-hhklc\") pod \"redhat-marketplace-wls8z\" (UID: \"4dddc9c6-f034-48d8-951b-ed880aac7ec6\") " pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.530146 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 14:59:40 crc kubenswrapper[4756]: I0318 14:59:40.666543 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-br5r4_be0a5a79-545f-4411-8cb1-d9de4e87d983/control-plane-machine-set-operator/0.log" Mar 18 14:59:41 crc kubenswrapper[4756]: I0318 14:59:41.033419 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rs6j9_3985e570-6d23-4928-a018-40e9b5868b89/machine-api-operator/0.log" Mar 18 14:59:41 crc kubenswrapper[4756]: I0318 14:59:41.044664 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rs6j9_3985e570-6d23-4928-a018-40e9b5868b89/kube-rbac-proxy/0.log" Mar 18 14:59:41 crc kubenswrapper[4756]: I0318 14:59:41.329011 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wls8z"] Mar 18 14:59:41 crc kubenswrapper[4756]: W0318 14:59:41.331476 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dddc9c6_f034_48d8_951b_ed880aac7ec6.slice/crio-1d7891514c387a1de3002765cbf5912d87925bc1ce5293c1ee00140ff3604236 WatchSource:0}: Error finding container 1d7891514c387a1de3002765cbf5912d87925bc1ce5293c1ee00140ff3604236: Status 404 returned error can't find the container with id 1d7891514c387a1de3002765cbf5912d87925bc1ce5293c1ee00140ff3604236 Mar 18 14:59:41 crc kubenswrapper[4756]: I0318 14:59:41.986166 4756 generic.go:334] "Generic (PLEG): container finished" podID="4dddc9c6-f034-48d8-951b-ed880aac7ec6" containerID="03fd338110c34fc7cebc12a1f0ecf86193263fa28f52ad4bd53fe5eda77b6039" exitCode=0 Mar 18 14:59:41 crc kubenswrapper[4756]: I0318 14:59:41.986276 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wls8z" 
event={"ID":"4dddc9c6-f034-48d8-951b-ed880aac7ec6","Type":"ContainerDied","Data":"03fd338110c34fc7cebc12a1f0ecf86193263fa28f52ad4bd53fe5eda77b6039"} Mar 18 14:59:41 crc kubenswrapper[4756]: I0318 14:59:41.986412 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wls8z" event={"ID":"4dddc9c6-f034-48d8-951b-ed880aac7ec6","Type":"ContainerStarted","Data":"1d7891514c387a1de3002765cbf5912d87925bc1ce5293c1ee00140ff3604236"} Mar 18 14:59:41 crc kubenswrapper[4756]: I0318 14:59:41.988367 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:59:45 crc kubenswrapper[4756]: I0318 14:59:45.016772 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wls8z" event={"ID":"4dddc9c6-f034-48d8-951b-ed880aac7ec6","Type":"ContainerStarted","Data":"8508e4c4ba01f81cf6bc3d420267184b1826f7fa8388330322b507d9c0ec9f01"} Mar 18 14:59:46 crc kubenswrapper[4756]: I0318 14:59:46.027476 4756 generic.go:334] "Generic (PLEG): container finished" podID="4dddc9c6-f034-48d8-951b-ed880aac7ec6" containerID="8508e4c4ba01f81cf6bc3d420267184b1826f7fa8388330322b507d9c0ec9f01" exitCode=0 Mar 18 14:59:46 crc kubenswrapper[4756]: I0318 14:59:46.027542 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wls8z" event={"ID":"4dddc9c6-f034-48d8-951b-ed880aac7ec6","Type":"ContainerDied","Data":"8508e4c4ba01f81cf6bc3d420267184b1826f7fa8388330322b507d9c0ec9f01"} Mar 18 14:59:48 crc kubenswrapper[4756]: I0318 14:59:48.048678 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wls8z" event={"ID":"4dddc9c6-f034-48d8-951b-ed880aac7ec6","Type":"ContainerStarted","Data":"b4b65b57c5498feaf8746957d2d4e9eb757c6969de374a7a715d8765c416ff28"} Mar 18 14:59:48 crc kubenswrapper[4756]: I0318 14:59:48.075685 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-wls8z" podStartSLOduration=3.163390571 podStartE2EDuration="8.075669274s" podCreationTimestamp="2026-03-18 14:59:40 +0000 UTC" firstStartedPulling="2026-03-18 14:59:41.988173627 +0000 UTC m=+3583.302591602" lastFinishedPulling="2026-03-18 14:59:46.90045233 +0000 UTC m=+3588.214870305" observedRunningTime="2026-03-18 14:59:48.067904853 +0000 UTC m=+3589.382322818" watchObservedRunningTime="2026-03-18 14:59:48.075669274 +0000 UTC m=+3589.390087249" Mar 18 14:59:50 crc kubenswrapper[4756]: I0318 14:59:50.530695 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 14:59:50 crc kubenswrapper[4756]: I0318 14:59:50.531334 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 14:59:50 crc kubenswrapper[4756]: I0318 14:59:50.601330 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.168028 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt"] Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.179529 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.183700 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.183823 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.192841 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564100-92vdr"] Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.212276 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564100-92vdr" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.219925 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt"] Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.222342 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.222442 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.234058 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.244645 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564100-92vdr"] Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.246002 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/363259bd-376a-4519-a06c-8b89d3c83f4f-secret-volume\") pod \"collect-profiles-29564100-6dfjt\" (UID: \"363259bd-376a-4519-a06c-8b89d3c83f4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.246174 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctz27\" (UniqueName: \"kubernetes.io/projected/363259bd-376a-4519-a06c-8b89d3c83f4f-kube-api-access-ctz27\") pod \"collect-profiles-29564100-6dfjt\" (UID: \"363259bd-376a-4519-a06c-8b89d3c83f4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.246235 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/363259bd-376a-4519-a06c-8b89d3c83f4f-config-volume\") pod \"collect-profiles-29564100-6dfjt\" (UID: \"363259bd-376a-4519-a06c-8b89d3c83f4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.246415 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7md2\" (UniqueName: \"kubernetes.io/projected/ae5ca546-1126-4c61-a42d-c561dbb8490d-kube-api-access-s7md2\") pod \"auto-csr-approver-29564100-92vdr\" (UID: \"ae5ca546-1126-4c61-a42d-c561dbb8490d\") " pod="openshift-infra/auto-csr-approver-29564100-92vdr" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.349325 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7md2\" (UniqueName: \"kubernetes.io/projected/ae5ca546-1126-4c61-a42d-c561dbb8490d-kube-api-access-s7md2\") pod \"auto-csr-approver-29564100-92vdr\" (UID: \"ae5ca546-1126-4c61-a42d-c561dbb8490d\") " pod="openshift-infra/auto-csr-approver-29564100-92vdr" Mar 18 
15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.349405 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/363259bd-376a-4519-a06c-8b89d3c83f4f-secret-volume\") pod \"collect-profiles-29564100-6dfjt\" (UID: \"363259bd-376a-4519-a06c-8b89d3c83f4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.349497 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctz27\" (UniqueName: \"kubernetes.io/projected/363259bd-376a-4519-a06c-8b89d3c83f4f-kube-api-access-ctz27\") pod \"collect-profiles-29564100-6dfjt\" (UID: \"363259bd-376a-4519-a06c-8b89d3c83f4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.349526 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/363259bd-376a-4519-a06c-8b89d3c83f4f-config-volume\") pod \"collect-profiles-29564100-6dfjt\" (UID: \"363259bd-376a-4519-a06c-8b89d3c83f4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.351406 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/363259bd-376a-4519-a06c-8b89d3c83f4f-config-volume\") pod \"collect-profiles-29564100-6dfjt\" (UID: \"363259bd-376a-4519-a06c-8b89d3c83f4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.362416 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/363259bd-376a-4519-a06c-8b89d3c83f4f-secret-volume\") pod \"collect-profiles-29564100-6dfjt\" (UID: 
\"363259bd-376a-4519-a06c-8b89d3c83f4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.365919 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7md2\" (UniqueName: \"kubernetes.io/projected/ae5ca546-1126-4c61-a42d-c561dbb8490d-kube-api-access-s7md2\") pod \"auto-csr-approver-29564100-92vdr\" (UID: \"ae5ca546-1126-4c61-a42d-c561dbb8490d\") " pod="openshift-infra/auto-csr-approver-29564100-92vdr" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.370763 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctz27\" (UniqueName: \"kubernetes.io/projected/363259bd-376a-4519-a06c-8b89d3c83f4f-kube-api-access-ctz27\") pod \"collect-profiles-29564100-6dfjt\" (UID: \"363259bd-376a-4519-a06c-8b89d3c83f4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.506454 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.557127 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564100-92vdr" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.646799 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 15:00:00 crc kubenswrapper[4756]: I0318 15:00:00.725996 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wls8z"] Mar 18 15:00:01 crc kubenswrapper[4756]: I0318 15:00:01.211864 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wls8z" podUID="4dddc9c6-f034-48d8-951b-ed880aac7ec6" containerName="registry-server" containerID="cri-o://b4b65b57c5498feaf8746957d2d4e9eb757c6969de374a7a715d8765c416ff28" gracePeriod=2 Mar 18 15:00:01 crc kubenswrapper[4756]: I0318 15:00:01.309899 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564100-92vdr"] Mar 18 15:00:01 crc kubenswrapper[4756]: I0318 15:00:01.881365 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt"] Mar 18 15:00:02 crc kubenswrapper[4756]: I0318 15:00:02.223907 4756 generic.go:334] "Generic (PLEG): container finished" podID="4dddc9c6-f034-48d8-951b-ed880aac7ec6" containerID="b4b65b57c5498feaf8746957d2d4e9eb757c6969de374a7a715d8765c416ff28" exitCode=0 Mar 18 15:00:02 crc kubenswrapper[4756]: I0318 15:00:02.224268 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wls8z" event={"ID":"4dddc9c6-f034-48d8-951b-ed880aac7ec6","Type":"ContainerDied","Data":"b4b65b57c5498feaf8746957d2d4e9eb757c6969de374a7a715d8765c416ff28"} Mar 18 15:00:02 crc kubenswrapper[4756]: I0318 15:00:02.226509 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564100-92vdr" 
event={"ID":"ae5ca546-1126-4c61-a42d-c561dbb8490d","Type":"ContainerStarted","Data":"d6f257ef806a923c9bb72d726b288772a5670a1342e54a5a3d4ac1aabc60096e"} Mar 18 15:00:02 crc kubenswrapper[4756]: I0318 15:00:02.228134 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" event={"ID":"363259bd-376a-4519-a06c-8b89d3c83f4f","Type":"ContainerStarted","Data":"b448d9f2c4e76b1c3ba67771e13c7757be0539206a4bd0752660caea7868315f"} Mar 18 15:00:02 crc kubenswrapper[4756]: I0318 15:00:02.733772 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 15:00:02 crc kubenswrapper[4756]: I0318 15:00:02.900732 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dddc9c6-f034-48d8-951b-ed880aac7ec6-catalog-content\") pod \"4dddc9c6-f034-48d8-951b-ed880aac7ec6\" (UID: \"4dddc9c6-f034-48d8-951b-ed880aac7ec6\") " Mar 18 15:00:02 crc kubenswrapper[4756]: I0318 15:00:02.901280 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dddc9c6-f034-48d8-951b-ed880aac7ec6-utilities\") pod \"4dddc9c6-f034-48d8-951b-ed880aac7ec6\" (UID: \"4dddc9c6-f034-48d8-951b-ed880aac7ec6\") " Mar 18 15:00:02 crc kubenswrapper[4756]: I0318 15:00:02.901466 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhklc\" (UniqueName: \"kubernetes.io/projected/4dddc9c6-f034-48d8-951b-ed880aac7ec6-kube-api-access-hhklc\") pod \"4dddc9c6-f034-48d8-951b-ed880aac7ec6\" (UID: \"4dddc9c6-f034-48d8-951b-ed880aac7ec6\") " Mar 18 15:00:02 crc kubenswrapper[4756]: I0318 15:00:02.903250 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dddc9c6-f034-48d8-951b-ed880aac7ec6-utilities" (OuterVolumeSpecName: 
"utilities") pod "4dddc9c6-f034-48d8-951b-ed880aac7ec6" (UID: "4dddc9c6-f034-48d8-951b-ed880aac7ec6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:00:02 crc kubenswrapper[4756]: I0318 15:00:02.913361 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dddc9c6-f034-48d8-951b-ed880aac7ec6-kube-api-access-hhklc" (OuterVolumeSpecName: "kube-api-access-hhklc") pod "4dddc9c6-f034-48d8-951b-ed880aac7ec6" (UID: "4dddc9c6-f034-48d8-951b-ed880aac7ec6"). InnerVolumeSpecName "kube-api-access-hhklc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:00:02 crc kubenswrapper[4756]: I0318 15:00:02.936162 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dddc9c6-f034-48d8-951b-ed880aac7ec6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4dddc9c6-f034-48d8-951b-ed880aac7ec6" (UID: "4dddc9c6-f034-48d8-951b-ed880aac7ec6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:00:03 crc kubenswrapper[4756]: I0318 15:00:03.004465 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dddc9c6-f034-48d8-951b-ed880aac7ec6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:00:03 crc kubenswrapper[4756]: I0318 15:00:03.004495 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhklc\" (UniqueName: \"kubernetes.io/projected/4dddc9c6-f034-48d8-951b-ed880aac7ec6-kube-api-access-hhklc\") on node \"crc\" DevicePath \"\"" Mar 18 15:00:03 crc kubenswrapper[4756]: I0318 15:00:03.004507 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dddc9c6-f034-48d8-951b-ed880aac7ec6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:00:03 crc kubenswrapper[4756]: I0318 15:00:03.237239 4756 generic.go:334] "Generic (PLEG): container finished" podID="363259bd-376a-4519-a06c-8b89d3c83f4f" containerID="6a40503e2dc2f4569bb29ab2dd49632e0c42e11a93d6b827a7c26516272ce9eb" exitCode=0 Mar 18 15:00:03 crc kubenswrapper[4756]: I0318 15:00:03.237314 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" event={"ID":"363259bd-376a-4519-a06c-8b89d3c83f4f","Type":"ContainerDied","Data":"6a40503e2dc2f4569bb29ab2dd49632e0c42e11a93d6b827a7c26516272ce9eb"} Mar 18 15:00:03 crc kubenswrapper[4756]: I0318 15:00:03.239301 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wls8z" event={"ID":"4dddc9c6-f034-48d8-951b-ed880aac7ec6","Type":"ContainerDied","Data":"1d7891514c387a1de3002765cbf5912d87925bc1ce5293c1ee00140ff3604236"} Mar 18 15:00:03 crc kubenswrapper[4756]: I0318 15:00:03.239347 4756 scope.go:117] "RemoveContainer" containerID="b4b65b57c5498feaf8746957d2d4e9eb757c6969de374a7a715d8765c416ff28" Mar 18 15:00:03 crc 
kubenswrapper[4756]: I0318 15:00:03.239384 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wls8z" Mar 18 15:00:03 crc kubenswrapper[4756]: I0318 15:00:03.263995 4756 scope.go:117] "RemoveContainer" containerID="8508e4c4ba01f81cf6bc3d420267184b1826f7fa8388330322b507d9c0ec9f01" Mar 18 15:00:03 crc kubenswrapper[4756]: I0318 15:00:03.322238 4756 scope.go:117] "RemoveContainer" containerID="03fd338110c34fc7cebc12a1f0ecf86193263fa28f52ad4bd53fe5eda77b6039" Mar 18 15:00:03 crc kubenswrapper[4756]: I0318 15:00:03.360030 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wls8z"] Mar 18 15:00:03 crc kubenswrapper[4756]: I0318 15:00:03.360065 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wls8z"] Mar 18 15:00:05 crc kubenswrapper[4756]: I0318 15:00:05.336548 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dddc9c6-f034-48d8-951b-ed880aac7ec6" path="/var/lib/kubelet/pods/4dddc9c6-f034-48d8-951b-ed880aac7ec6/volumes" Mar 18 15:00:05 crc kubenswrapper[4756]: I0318 15:00:05.386183 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" Mar 18 15:00:05 crc kubenswrapper[4756]: I0318 15:00:05.476567 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctz27\" (UniqueName: \"kubernetes.io/projected/363259bd-376a-4519-a06c-8b89d3c83f4f-kube-api-access-ctz27\") pod \"363259bd-376a-4519-a06c-8b89d3c83f4f\" (UID: \"363259bd-376a-4519-a06c-8b89d3c83f4f\") " Mar 18 15:00:05 crc kubenswrapper[4756]: I0318 15:00:05.476830 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/363259bd-376a-4519-a06c-8b89d3c83f4f-config-volume\") pod \"363259bd-376a-4519-a06c-8b89d3c83f4f\" (UID: \"363259bd-376a-4519-a06c-8b89d3c83f4f\") " Mar 18 15:00:05 crc kubenswrapper[4756]: I0318 15:00:05.478441 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/363259bd-376a-4519-a06c-8b89d3c83f4f-config-volume" (OuterVolumeSpecName: "config-volume") pod "363259bd-376a-4519-a06c-8b89d3c83f4f" (UID: "363259bd-376a-4519-a06c-8b89d3c83f4f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 15:00:05 crc kubenswrapper[4756]: I0318 15:00:05.493334 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/363259bd-376a-4519-a06c-8b89d3c83f4f-kube-api-access-ctz27" (OuterVolumeSpecName: "kube-api-access-ctz27") pod "363259bd-376a-4519-a06c-8b89d3c83f4f" (UID: "363259bd-376a-4519-a06c-8b89d3c83f4f"). InnerVolumeSpecName "kube-api-access-ctz27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:00:05 crc kubenswrapper[4756]: I0318 15:00:05.578792 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/363259bd-376a-4519-a06c-8b89d3c83f4f-secret-volume\") pod \"363259bd-376a-4519-a06c-8b89d3c83f4f\" (UID: \"363259bd-376a-4519-a06c-8b89d3c83f4f\") " Mar 18 15:00:05 crc kubenswrapper[4756]: I0318 15:00:05.579552 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctz27\" (UniqueName: \"kubernetes.io/projected/363259bd-376a-4519-a06c-8b89d3c83f4f-kube-api-access-ctz27\") on node \"crc\" DevicePath \"\"" Mar 18 15:00:05 crc kubenswrapper[4756]: I0318 15:00:05.579570 4756 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/363259bd-376a-4519-a06c-8b89d3c83f4f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:00:05 crc kubenswrapper[4756]: I0318 15:00:05.581790 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/363259bd-376a-4519-a06c-8b89d3c83f4f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "363259bd-376a-4519-a06c-8b89d3c83f4f" (UID: "363259bd-376a-4519-a06c-8b89d3c83f4f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:00:05 crc kubenswrapper[4756]: I0318 15:00:05.681453 4756 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/363259bd-376a-4519-a06c-8b89d3c83f4f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 15:00:06 crc kubenswrapper[4756]: I0318 15:00:06.279674 4756 generic.go:334] "Generic (PLEG): container finished" podID="ae5ca546-1126-4c61-a42d-c561dbb8490d" containerID="5d10793298ced0ca9dca32db7e81398462a119a0f084762400fd3865129a7acc" exitCode=0 Mar 18 15:00:06 crc kubenswrapper[4756]: I0318 15:00:06.279764 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564100-92vdr" event={"ID":"ae5ca546-1126-4c61-a42d-c561dbb8490d","Type":"ContainerDied","Data":"5d10793298ced0ca9dca32db7e81398462a119a0f084762400fd3865129a7acc"} Mar 18 15:00:06 crc kubenswrapper[4756]: I0318 15:00:06.281890 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" event={"ID":"363259bd-376a-4519-a06c-8b89d3c83f4f","Type":"ContainerDied","Data":"b448d9f2c4e76b1c3ba67771e13c7757be0539206a4bd0752660caea7868315f"} Mar 18 15:00:06 crc kubenswrapper[4756]: I0318 15:00:06.281925 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b448d9f2c4e76b1c3ba67771e13c7757be0539206a4bd0752660caea7868315f" Mar 18 15:00:06 crc kubenswrapper[4756]: I0318 15:00:06.282103 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564100-6dfjt" Mar 18 15:00:06 crc kubenswrapper[4756]: I0318 15:00:06.494858 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s"] Mar 18 15:00:06 crc kubenswrapper[4756]: I0318 15:00:06.527005 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564055-bkp2s"] Mar 18 15:00:06 crc kubenswrapper[4756]: I0318 15:00:06.638879 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-7t9v8_456f41d7-b340-4611-914b-cb23b10b8644/cert-manager-controller/0.log" Mar 18 15:00:06 crc kubenswrapper[4756]: I0318 15:00:06.860035 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-n75wg_59b96a93-c409-494f-9c33-3cf8612a5c3c/cert-manager-cainjector/0.log" Mar 18 15:00:07 crc kubenswrapper[4756]: I0318 15:00:07.172629 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-9dtmc_0b2d2be8-2089-4bf6-9dec-ac1070616f89/cert-manager-webhook/0.log" Mar 18 15:00:07 crc kubenswrapper[4756]: I0318 15:00:07.331530 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34dec4ac-ef61-4769-a369-e0f463c78467" path="/var/lib/kubelet/pods/34dec4ac-ef61-4769-a369-e0f463c78467/volumes" Mar 18 15:00:08 crc kubenswrapper[4756]: I0318 15:00:08.385779 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564100-92vdr" Mar 18 15:00:08 crc kubenswrapper[4756]: I0318 15:00:08.534935 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7md2\" (UniqueName: \"kubernetes.io/projected/ae5ca546-1126-4c61-a42d-c561dbb8490d-kube-api-access-s7md2\") pod \"ae5ca546-1126-4c61-a42d-c561dbb8490d\" (UID: \"ae5ca546-1126-4c61-a42d-c561dbb8490d\") " Mar 18 15:00:08 crc kubenswrapper[4756]: I0318 15:00:08.542000 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae5ca546-1126-4c61-a42d-c561dbb8490d-kube-api-access-s7md2" (OuterVolumeSpecName: "kube-api-access-s7md2") pod "ae5ca546-1126-4c61-a42d-c561dbb8490d" (UID: "ae5ca546-1126-4c61-a42d-c561dbb8490d"). InnerVolumeSpecName "kube-api-access-s7md2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:00:08 crc kubenswrapper[4756]: I0318 15:00:08.637622 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7md2\" (UniqueName: \"kubernetes.io/projected/ae5ca546-1126-4c61-a42d-c561dbb8490d-kube-api-access-s7md2\") on node \"crc\" DevicePath \"\"" Mar 18 15:00:09 crc kubenswrapper[4756]: I0318 15:00:09.308765 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564100-92vdr" event={"ID":"ae5ca546-1126-4c61-a42d-c561dbb8490d","Type":"ContainerDied","Data":"d6f257ef806a923c9bb72d726b288772a5670a1342e54a5a3d4ac1aabc60096e"} Mar 18 15:00:09 crc kubenswrapper[4756]: I0318 15:00:09.309000 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6f257ef806a923c9bb72d726b288772a5670a1342e54a5a3d4ac1aabc60096e" Mar 18 15:00:09 crc kubenswrapper[4756]: I0318 15:00:09.309047 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564100-92vdr" Mar 18 15:00:09 crc kubenswrapper[4756]: I0318 15:00:09.456354 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564094-kpsmp"] Mar 18 15:00:09 crc kubenswrapper[4756]: I0318 15:00:09.475602 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564094-kpsmp"] Mar 18 15:00:11 crc kubenswrapper[4756]: I0318 15:00:11.328903 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f63def4-e2ed-400f-952f-ac41a3d00414" path="/var/lib/kubelet/pods/6f63def4-e2ed-400f-952f-ac41a3d00414/volumes" Mar 18 15:00:32 crc kubenswrapper[4756]: I0318 15:00:32.756770 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-5gg88_a4975a7f-ddbb-46e3-91be-b1a7757abced/nmstate-console-plugin/0.log" Mar 18 15:00:33 crc kubenswrapper[4756]: I0318 15:00:33.058880 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-qd4sc_f175bb68-1110-4701-b4a1-9eb04330fdb2/nmstate-handler/0.log" Mar 18 15:00:33 crc kubenswrapper[4756]: I0318 15:00:33.574152 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-895gl_c81cea89-3315-4647-8596-e0132b8dd763/kube-rbac-proxy/0.log" Mar 18 15:00:33 crc kubenswrapper[4756]: I0318 15:00:33.603179 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-895gl_c81cea89-3315-4647-8596-e0132b8dd763/nmstate-metrics/0.log" Mar 18 15:00:33 crc kubenswrapper[4756]: I0318 15:00:33.919886 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-bjjcv_bc5a9dab-2f6a-456b-83de-eb45d03c4062/nmstate-operator/0.log" Mar 18 15:00:33 crc kubenswrapper[4756]: I0318 15:00:33.925646 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-65v9t_593a9109-9e37-47a7-b467-de7be3502ba4/nmstate-webhook/0.log" Mar 18 15:00:34 crc kubenswrapper[4756]: I0318 15:00:34.964720 4756 scope.go:117] "RemoveContainer" containerID="73c64e3b6232d7aaa06d468af86b569fa9f44e9ab820edcefac196f11f754c5d" Mar 18 15:00:34 crc kubenswrapper[4756]: I0318 15:00:34.996636 4756 scope.go:117] "RemoveContainer" containerID="512b6fd45de2b3ccc2b133724475a3166dc5b70175af817a19685cf8aa9525ef" Mar 18 15:00:59 crc kubenswrapper[4756]: I0318 15:00:59.328592 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7d4b6cd968-2lpml_84e944f2-90e7-4c7a-802d-703a8ef82200/manager/0.log" Mar 18 15:00:59 crc kubenswrapper[4756]: I0318 15:00:59.493337 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7d4b6cd968-2lpml_84e944f2-90e7-4c7a-802d-703a8ef82200/kube-rbac-proxy/0.log" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.157725 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29564101-hl2r4"] Mar 18 15:01:00 crc kubenswrapper[4756]: E0318 15:01:00.158497 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dddc9c6-f034-48d8-951b-ed880aac7ec6" containerName="registry-server" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.158513 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dddc9c6-f034-48d8-951b-ed880aac7ec6" containerName="registry-server" Mar 18 15:01:00 crc kubenswrapper[4756]: E0318 15:01:00.158529 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dddc9c6-f034-48d8-951b-ed880aac7ec6" containerName="extract-utilities" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.158537 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dddc9c6-f034-48d8-951b-ed880aac7ec6" containerName="extract-utilities" Mar 18 15:01:00 crc 
kubenswrapper[4756]: E0318 15:01:00.158564 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dddc9c6-f034-48d8-951b-ed880aac7ec6" containerName="extract-content" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.158572 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dddc9c6-f034-48d8-951b-ed880aac7ec6" containerName="extract-content" Mar 18 15:01:00 crc kubenswrapper[4756]: E0318 15:01:00.158588 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5ca546-1126-4c61-a42d-c561dbb8490d" containerName="oc" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.158594 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5ca546-1126-4c61-a42d-c561dbb8490d" containerName="oc" Mar 18 15:01:00 crc kubenswrapper[4756]: E0318 15:01:00.158606 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="363259bd-376a-4519-a06c-8b89d3c83f4f" containerName="collect-profiles" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.158614 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="363259bd-376a-4519-a06c-8b89d3c83f4f" containerName="collect-profiles" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.158845 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae5ca546-1126-4c61-a42d-c561dbb8490d" containerName="oc" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.158867 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="363259bd-376a-4519-a06c-8b89d3c83f4f" containerName="collect-profiles" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.158889 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dddc9c6-f034-48d8-951b-ed880aac7ec6" containerName="registry-server" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.159781 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.180596 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564101-hl2r4"] Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.193688 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-config-data\") pod \"keystone-cron-29564101-hl2r4\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.193838 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-fernet-keys\") pod \"keystone-cron-29564101-hl2r4\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.193950 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kdhb\" (UniqueName: \"kubernetes.io/projected/2700a709-bc12-4aa6-a962-6e796330517f-kube-api-access-8kdhb\") pod \"keystone-cron-29564101-hl2r4\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.193970 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-combined-ca-bundle\") pod \"keystone-cron-29564101-hl2r4\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.295670 4756 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-fernet-keys\") pod \"keystone-cron-29564101-hl2r4\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.296052 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kdhb\" (UniqueName: \"kubernetes.io/projected/2700a709-bc12-4aa6-a962-6e796330517f-kube-api-access-8kdhb\") pod \"keystone-cron-29564101-hl2r4\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.296163 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-combined-ca-bundle\") pod \"keystone-cron-29564101-hl2r4\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.296296 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-config-data\") pod \"keystone-cron-29564101-hl2r4\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.304342 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-config-data\") pod \"keystone-cron-29564101-hl2r4\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.312047 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-fernet-keys\") pod \"keystone-cron-29564101-hl2r4\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.319024 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kdhb\" (UniqueName: \"kubernetes.io/projected/2700a709-bc12-4aa6-a962-6e796330517f-kube-api-access-8kdhb\") pod \"keystone-cron-29564101-hl2r4\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.319093 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-combined-ca-bundle\") pod \"keystone-cron-29564101-hl2r4\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:00 crc kubenswrapper[4756]: I0318 15:01:00.515185 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:01 crc kubenswrapper[4756]: I0318 15:01:01.234212 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564101-hl2r4"] Mar 18 15:01:01 crc kubenswrapper[4756]: I0318 15:01:01.806054 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564101-hl2r4" event={"ID":"2700a709-bc12-4aa6-a962-6e796330517f","Type":"ContainerStarted","Data":"7a9e7a2ea120f874d278c6af4f40c32f49c50d618f4e6b0586dee053d3bd613b"} Mar 18 15:01:01 crc kubenswrapper[4756]: I0318 15:01:01.807441 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564101-hl2r4" event={"ID":"2700a709-bc12-4aa6-a962-6e796330517f","Type":"ContainerStarted","Data":"3512b0c824b8e72368ba427623eee66d1a0a849d44158381667ca8e211b5b78e"} Mar 18 15:01:01 crc kubenswrapper[4756]: I0318 15:01:01.835260 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29564101-hl2r4" podStartSLOduration=1.835238183 podStartE2EDuration="1.835238183s" podCreationTimestamp="2026-03-18 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 15:01:01.824773799 +0000 UTC m=+3663.139191774" watchObservedRunningTime="2026-03-18 15:01:01.835238183 +0000 UTC m=+3663.149656158" Mar 18 15:01:03 crc kubenswrapper[4756]: I0318 15:01:03.246883 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jjkd8"] Mar 18 15:01:03 crc kubenswrapper[4756]: I0318 15:01:03.249274 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:03 crc kubenswrapper[4756]: I0318 15:01:03.264041 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjkd8"] Mar 18 15:01:03 crc kubenswrapper[4756]: I0318 15:01:03.354161 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq2rs\" (UniqueName: \"kubernetes.io/projected/1accbcd6-3491-4377-ac4f-0e82900678b4-kube-api-access-hq2rs\") pod \"certified-operators-jjkd8\" (UID: \"1accbcd6-3491-4377-ac4f-0e82900678b4\") " pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:03 crc kubenswrapper[4756]: I0318 15:01:03.354309 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1accbcd6-3491-4377-ac4f-0e82900678b4-catalog-content\") pod \"certified-operators-jjkd8\" (UID: \"1accbcd6-3491-4377-ac4f-0e82900678b4\") " pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:03 crc kubenswrapper[4756]: I0318 15:01:03.354383 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1accbcd6-3491-4377-ac4f-0e82900678b4-utilities\") pod \"certified-operators-jjkd8\" (UID: \"1accbcd6-3491-4377-ac4f-0e82900678b4\") " pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:03 crc kubenswrapper[4756]: I0318 15:01:03.456373 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1accbcd6-3491-4377-ac4f-0e82900678b4-utilities\") pod \"certified-operators-jjkd8\" (UID: \"1accbcd6-3491-4377-ac4f-0e82900678b4\") " pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:03 crc kubenswrapper[4756]: I0318 15:01:03.456431 4756 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hq2rs\" (UniqueName: \"kubernetes.io/projected/1accbcd6-3491-4377-ac4f-0e82900678b4-kube-api-access-hq2rs\") pod \"certified-operators-jjkd8\" (UID: \"1accbcd6-3491-4377-ac4f-0e82900678b4\") " pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:03 crc kubenswrapper[4756]: I0318 15:01:03.456690 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1accbcd6-3491-4377-ac4f-0e82900678b4-catalog-content\") pod \"certified-operators-jjkd8\" (UID: \"1accbcd6-3491-4377-ac4f-0e82900678b4\") " pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:03 crc kubenswrapper[4756]: I0318 15:01:03.457304 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1accbcd6-3491-4377-ac4f-0e82900678b4-utilities\") pod \"certified-operators-jjkd8\" (UID: \"1accbcd6-3491-4377-ac4f-0e82900678b4\") " pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:03 crc kubenswrapper[4756]: I0318 15:01:03.457380 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1accbcd6-3491-4377-ac4f-0e82900678b4-catalog-content\") pod \"certified-operators-jjkd8\" (UID: \"1accbcd6-3491-4377-ac4f-0e82900678b4\") " pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:03 crc kubenswrapper[4756]: I0318 15:01:03.477614 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq2rs\" (UniqueName: \"kubernetes.io/projected/1accbcd6-3491-4377-ac4f-0e82900678b4-kube-api-access-hq2rs\") pod \"certified-operators-jjkd8\" (UID: \"1accbcd6-3491-4377-ac4f-0e82900678b4\") " pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:03 crc kubenswrapper[4756]: I0318 15:01:03.565261 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:04 crc kubenswrapper[4756]: I0318 15:01:04.432229 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjkd8"] Mar 18 15:01:04 crc kubenswrapper[4756]: W0318 15:01:04.439413 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1accbcd6_3491_4377_ac4f_0e82900678b4.slice/crio-762cfc0e4259f22b65dde77dba353cd1900daf39c88bdf7025d30d937942184c WatchSource:0}: Error finding container 762cfc0e4259f22b65dde77dba353cd1900daf39c88bdf7025d30d937942184c: Status 404 returned error can't find the container with id 762cfc0e4259f22b65dde77dba353cd1900daf39c88bdf7025d30d937942184c Mar 18 15:01:04 crc kubenswrapper[4756]: I0318 15:01:04.847990 4756 generic.go:334] "Generic (PLEG): container finished" podID="1accbcd6-3491-4377-ac4f-0e82900678b4" containerID="6c1afdf9b3533c257168f51235af3a8748449b09163a7103139e5e3091166eea" exitCode=0 Mar 18 15:01:04 crc kubenswrapper[4756]: I0318 15:01:04.848583 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjkd8" event={"ID":"1accbcd6-3491-4377-ac4f-0e82900678b4","Type":"ContainerDied","Data":"6c1afdf9b3533c257168f51235af3a8748449b09163a7103139e5e3091166eea"} Mar 18 15:01:04 crc kubenswrapper[4756]: I0318 15:01:04.848693 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjkd8" event={"ID":"1accbcd6-3491-4377-ac4f-0e82900678b4","Type":"ContainerStarted","Data":"762cfc0e4259f22b65dde77dba353cd1900daf39c88bdf7025d30d937942184c"} Mar 18 15:01:06 crc kubenswrapper[4756]: I0318 15:01:06.868372 4756 generic.go:334] "Generic (PLEG): container finished" podID="2700a709-bc12-4aa6-a962-6e796330517f" containerID="7a9e7a2ea120f874d278c6af4f40c32f49c50d618f4e6b0586dee053d3bd613b" exitCode=0 Mar 18 15:01:06 crc kubenswrapper[4756]: I0318 
15:01:06.868534 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564101-hl2r4" event={"ID":"2700a709-bc12-4aa6-a962-6e796330517f","Type":"ContainerDied","Data":"7a9e7a2ea120f874d278c6af4f40c32f49c50d618f4e6b0586dee053d3bd613b"} Mar 18 15:01:06 crc kubenswrapper[4756]: I0318 15:01:06.914922 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:01:06 crc kubenswrapper[4756]: I0318 15:01:06.914976 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:01:08 crc kubenswrapper[4756]: I0318 15:01:08.888737 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjkd8" event={"ID":"1accbcd6-3491-4377-ac4f-0e82900678b4","Type":"ContainerStarted","Data":"f6f617b68cee5caffd57f6a090d822ec5ea69c2b0fe070c9d823189650dc496b"} Mar 18 15:01:08 crc kubenswrapper[4756]: I0318 15:01:08.993995 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.117940 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kdhb\" (UniqueName: \"kubernetes.io/projected/2700a709-bc12-4aa6-a962-6e796330517f-kube-api-access-8kdhb\") pod \"2700a709-bc12-4aa6-a962-6e796330517f\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.117986 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-config-data\") pod \"2700a709-bc12-4aa6-a962-6e796330517f\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.118121 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-combined-ca-bundle\") pod \"2700a709-bc12-4aa6-a962-6e796330517f\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.118339 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-fernet-keys\") pod \"2700a709-bc12-4aa6-a962-6e796330517f\" (UID: \"2700a709-bc12-4aa6-a962-6e796330517f\") " Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.125758 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2700a709-bc12-4aa6-a962-6e796330517f" (UID: "2700a709-bc12-4aa6-a962-6e796330517f"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.133304 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2700a709-bc12-4aa6-a962-6e796330517f-kube-api-access-8kdhb" (OuterVolumeSpecName: "kube-api-access-8kdhb") pod "2700a709-bc12-4aa6-a962-6e796330517f" (UID: "2700a709-bc12-4aa6-a962-6e796330517f"). InnerVolumeSpecName "kube-api-access-8kdhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.156307 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2700a709-bc12-4aa6-a962-6e796330517f" (UID: "2700a709-bc12-4aa6-a962-6e796330517f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.186589 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-config-data" (OuterVolumeSpecName: "config-data") pod "2700a709-bc12-4aa6-a962-6e796330517f" (UID: "2700a709-bc12-4aa6-a962-6e796330517f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.222558 4756 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.222600 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kdhb\" (UniqueName: \"kubernetes.io/projected/2700a709-bc12-4aa6-a962-6e796330517f-kube-api-access-8kdhb\") on node \"crc\" DevicePath \"\"" Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.222611 4756 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.222619 4756 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2700a709-bc12-4aa6-a962-6e796330517f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.898764 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564101-hl2r4" Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.898763 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564101-hl2r4" event={"ID":"2700a709-bc12-4aa6-a962-6e796330517f","Type":"ContainerDied","Data":"3512b0c824b8e72368ba427623eee66d1a0a849d44158381667ca8e211b5b78e"} Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.899162 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3512b0c824b8e72368ba427623eee66d1a0a849d44158381667ca8e211b5b78e" Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.903618 4756 generic.go:334] "Generic (PLEG): container finished" podID="1accbcd6-3491-4377-ac4f-0e82900678b4" containerID="f6f617b68cee5caffd57f6a090d822ec5ea69c2b0fe070c9d823189650dc496b" exitCode=0 Mar 18 15:01:09 crc kubenswrapper[4756]: I0318 15:01:09.903675 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjkd8" event={"ID":"1accbcd6-3491-4377-ac4f-0e82900678b4","Type":"ContainerDied","Data":"f6f617b68cee5caffd57f6a090d822ec5ea69c2b0fe070c9d823189650dc496b"} Mar 18 15:01:10 crc kubenswrapper[4756]: I0318 15:01:10.913956 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjkd8" event={"ID":"1accbcd6-3491-4377-ac4f-0e82900678b4","Type":"ContainerStarted","Data":"a69d7510915b5a72e6393fac39f215a4813fff21d195745374653a5c83a85eef"} Mar 18 15:01:10 crc kubenswrapper[4756]: I0318 15:01:10.953760 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jjkd8" podStartSLOduration=2.214658102 podStartE2EDuration="7.953744758s" podCreationTimestamp="2026-03-18 15:01:03 +0000 UTC" firstStartedPulling="2026-03-18 15:01:04.850475719 +0000 UTC m=+3666.164893694" lastFinishedPulling="2026-03-18 15:01:10.589562375 +0000 UTC m=+3671.903980350" 
observedRunningTime="2026-03-18 15:01:10.950007866 +0000 UTC m=+3672.264425841" watchObservedRunningTime="2026-03-18 15:01:10.953744758 +0000 UTC m=+3672.268162733" Mar 18 15:01:13 crc kubenswrapper[4756]: I0318 15:01:13.566329 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:13 crc kubenswrapper[4756]: I0318 15:01:13.567481 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:13 crc kubenswrapper[4756]: I0318 15:01:13.618179 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:23 crc kubenswrapper[4756]: I0318 15:01:23.631278 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:23 crc kubenswrapper[4756]: I0318 15:01:23.684253 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jjkd8"] Mar 18 15:01:24 crc kubenswrapper[4756]: I0318 15:01:24.030186 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jjkd8" podUID="1accbcd6-3491-4377-ac4f-0e82900678b4" containerName="registry-server" containerID="cri-o://a69d7510915b5a72e6393fac39f215a4813fff21d195745374653a5c83a85eef" gracePeriod=2 Mar 18 15:01:24 crc kubenswrapper[4756]: I0318 15:01:24.937477 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.060258 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1accbcd6-3491-4377-ac4f-0e82900678b4-utilities\") pod \"1accbcd6-3491-4377-ac4f-0e82900678b4\" (UID: \"1accbcd6-3491-4377-ac4f-0e82900678b4\") " Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.060352 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq2rs\" (UniqueName: \"kubernetes.io/projected/1accbcd6-3491-4377-ac4f-0e82900678b4-kube-api-access-hq2rs\") pod \"1accbcd6-3491-4377-ac4f-0e82900678b4\" (UID: \"1accbcd6-3491-4377-ac4f-0e82900678b4\") " Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.060379 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1accbcd6-3491-4377-ac4f-0e82900678b4-catalog-content\") pod \"1accbcd6-3491-4377-ac4f-0e82900678b4\" (UID: \"1accbcd6-3491-4377-ac4f-0e82900678b4\") " Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.065216 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1accbcd6-3491-4377-ac4f-0e82900678b4-utilities" (OuterVolumeSpecName: "utilities") pod "1accbcd6-3491-4377-ac4f-0e82900678b4" (UID: "1accbcd6-3491-4377-ac4f-0e82900678b4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.068521 4756 generic.go:334] "Generic (PLEG): container finished" podID="1accbcd6-3491-4377-ac4f-0e82900678b4" containerID="a69d7510915b5a72e6393fac39f215a4813fff21d195745374653a5c83a85eef" exitCode=0 Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.068576 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjkd8" event={"ID":"1accbcd6-3491-4377-ac4f-0e82900678b4","Type":"ContainerDied","Data":"a69d7510915b5a72e6393fac39f215a4813fff21d195745374653a5c83a85eef"} Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.068603 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjkd8" event={"ID":"1accbcd6-3491-4377-ac4f-0e82900678b4","Type":"ContainerDied","Data":"762cfc0e4259f22b65dde77dba353cd1900daf39c88bdf7025d30d937942184c"} Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.068619 4756 scope.go:117] "RemoveContainer" containerID="a69d7510915b5a72e6393fac39f215a4813fff21d195745374653a5c83a85eef" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.068756 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjkd8" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.075575 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1accbcd6-3491-4377-ac4f-0e82900678b4-kube-api-access-hq2rs" (OuterVolumeSpecName: "kube-api-access-hq2rs") pod "1accbcd6-3491-4377-ac4f-0e82900678b4" (UID: "1accbcd6-3491-4377-ac4f-0e82900678b4"). InnerVolumeSpecName "kube-api-access-hq2rs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.159939 4756 scope.go:117] "RemoveContainer" containerID="f6f617b68cee5caffd57f6a090d822ec5ea69c2b0fe070c9d823189650dc496b" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.165318 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1accbcd6-3491-4377-ac4f-0e82900678b4-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.165360 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq2rs\" (UniqueName: \"kubernetes.io/projected/1accbcd6-3491-4377-ac4f-0e82900678b4-kube-api-access-hq2rs\") on node \"crc\" DevicePath \"\"" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.177551 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1accbcd6-3491-4377-ac4f-0e82900678b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1accbcd6-3491-4377-ac4f-0e82900678b4" (UID: "1accbcd6-3491-4377-ac4f-0e82900678b4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.198744 4756 scope.go:117] "RemoveContainer" containerID="6c1afdf9b3533c257168f51235af3a8748449b09163a7103139e5e3091166eea" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.233231 4756 scope.go:117] "RemoveContainer" containerID="a69d7510915b5a72e6393fac39f215a4813fff21d195745374653a5c83a85eef" Mar 18 15:01:25 crc kubenswrapper[4756]: E0318 15:01:25.233657 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69d7510915b5a72e6393fac39f215a4813fff21d195745374653a5c83a85eef\": container with ID starting with a69d7510915b5a72e6393fac39f215a4813fff21d195745374653a5c83a85eef not found: ID does not exist" containerID="a69d7510915b5a72e6393fac39f215a4813fff21d195745374653a5c83a85eef" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.233697 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69d7510915b5a72e6393fac39f215a4813fff21d195745374653a5c83a85eef"} err="failed to get container status \"a69d7510915b5a72e6393fac39f215a4813fff21d195745374653a5c83a85eef\": rpc error: code = NotFound desc = could not find container \"a69d7510915b5a72e6393fac39f215a4813fff21d195745374653a5c83a85eef\": container with ID starting with a69d7510915b5a72e6393fac39f215a4813fff21d195745374653a5c83a85eef not found: ID does not exist" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.233723 4756 scope.go:117] "RemoveContainer" containerID="f6f617b68cee5caffd57f6a090d822ec5ea69c2b0fe070c9d823189650dc496b" Mar 18 15:01:25 crc kubenswrapper[4756]: E0318 15:01:25.234055 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f617b68cee5caffd57f6a090d822ec5ea69c2b0fe070c9d823189650dc496b\": container with ID starting with 
f6f617b68cee5caffd57f6a090d822ec5ea69c2b0fe070c9d823189650dc496b not found: ID does not exist" containerID="f6f617b68cee5caffd57f6a090d822ec5ea69c2b0fe070c9d823189650dc496b" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.234094 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f617b68cee5caffd57f6a090d822ec5ea69c2b0fe070c9d823189650dc496b"} err="failed to get container status \"f6f617b68cee5caffd57f6a090d822ec5ea69c2b0fe070c9d823189650dc496b\": rpc error: code = NotFound desc = could not find container \"f6f617b68cee5caffd57f6a090d822ec5ea69c2b0fe070c9d823189650dc496b\": container with ID starting with f6f617b68cee5caffd57f6a090d822ec5ea69c2b0fe070c9d823189650dc496b not found: ID does not exist" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.234154 4756 scope.go:117] "RemoveContainer" containerID="6c1afdf9b3533c257168f51235af3a8748449b09163a7103139e5e3091166eea" Mar 18 15:01:25 crc kubenswrapper[4756]: E0318 15:01:25.234422 4756 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c1afdf9b3533c257168f51235af3a8748449b09163a7103139e5e3091166eea\": container with ID starting with 6c1afdf9b3533c257168f51235af3a8748449b09163a7103139e5e3091166eea not found: ID does not exist" containerID="6c1afdf9b3533c257168f51235af3a8748449b09163a7103139e5e3091166eea" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.234444 4756 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c1afdf9b3533c257168f51235af3a8748449b09163a7103139e5e3091166eea"} err="failed to get container status \"6c1afdf9b3533c257168f51235af3a8748449b09163a7103139e5e3091166eea\": rpc error: code = NotFound desc = could not find container \"6c1afdf9b3533c257168f51235af3a8748449b09163a7103139e5e3091166eea\": container with ID starting with 6c1afdf9b3533c257168f51235af3a8748449b09163a7103139e5e3091166eea not found: ID does not 
exist" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.267294 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1accbcd6-3491-4377-ac4f-0e82900678b4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.392093 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jjkd8"] Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.402022 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jjkd8"] Mar 18 15:01:25 crc kubenswrapper[4756]: I0318 15:01:25.692836 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-6kvvd_df2f3290-a194-4fa7-9c5c-533c329bc34b/prometheus-operator/0.log" Mar 18 15:01:26 crc kubenswrapper[4756]: I0318 15:01:26.012645 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_b1ddfb6b-2173-4ddc-84aa-437858c62a2a/prometheus-operator-admission-webhook/0.log" Mar 18 15:01:26 crc kubenswrapper[4756]: I0318 15:01:26.023551 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_7cef672e-9e83-4a19-90e0-8d078a871e02/prometheus-operator-admission-webhook/0.log" Mar 18 15:01:26 crc kubenswrapper[4756]: I0318 15:01:26.387664 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-6cdcccbffc-hw5bz_5044ac67-cd21-42cd-8fc4-63d7a532038d/perses-operator/0.log" Mar 18 15:01:26 crc kubenswrapper[4756]: I0318 15:01:26.507698 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-7rvzd_626fab88-a2be-43fb-9679-6324c7105bd9/operator/0.log" Mar 18 15:01:27 crc kubenswrapper[4756]: I0318 15:01:27.325434 4756 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1accbcd6-3491-4377-ac4f-0e82900678b4" path="/var/lib/kubelet/pods/1accbcd6-3491-4377-ac4f-0e82900678b4/volumes" Mar 18 15:01:36 crc kubenswrapper[4756]: I0318 15:01:36.916942 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:01:36 crc kubenswrapper[4756]: I0318 15:01:36.917402 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:01:51 crc kubenswrapper[4756]: I0318 15:01:51.141928 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-zwr45_660a6ced-cc4a-4447-91ca-aa0b46e7c6ef/controller/0.log" Mar 18 15:01:51 crc kubenswrapper[4756]: I0318 15:01:51.253328 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-zwr45_660a6ced-cc4a-4447-91ca-aa0b46e7c6ef/kube-rbac-proxy/0.log" Mar 18 15:01:51 crc kubenswrapper[4756]: I0318 15:01:51.500604 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/cp-frr-files/0.log" Mar 18 15:01:51 crc kubenswrapper[4756]: I0318 15:01:51.925246 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/cp-frr-files/0.log" Mar 18 15:01:51 crc kubenswrapper[4756]: I0318 15:01:51.997392 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/cp-metrics/0.log" Mar 18 15:01:52 crc kubenswrapper[4756]: I0318 15:01:52.012606 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/cp-reloader/0.log" Mar 18 15:01:52 crc kubenswrapper[4756]: I0318 15:01:52.091288 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/cp-reloader/0.log" Mar 18 15:01:52 crc kubenswrapper[4756]: I0318 15:01:52.309776 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/cp-frr-files/0.log" Mar 18 15:01:52 crc kubenswrapper[4756]: I0318 15:01:52.366104 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/cp-reloader/0.log" Mar 18 15:01:52 crc kubenswrapper[4756]: I0318 15:01:52.396075 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/cp-metrics/0.log" Mar 18 15:01:52 crc kubenswrapper[4756]: I0318 15:01:52.474383 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/cp-metrics/0.log" Mar 18 15:01:52 crc kubenswrapper[4756]: I0318 15:01:52.836906 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/cp-frr-files/0.log" Mar 18 15:01:52 crc kubenswrapper[4756]: I0318 15:01:52.836942 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/cp-reloader/0.log" Mar 18 15:01:52 crc kubenswrapper[4756]: I0318 15:01:52.884219 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/cp-metrics/0.log" Mar 18 15:01:52 crc kubenswrapper[4756]: I0318 15:01:52.911252 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/controller/0.log" Mar 18 15:01:53 crc kubenswrapper[4756]: I0318 15:01:53.158908 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/frr-metrics/0.log" Mar 18 15:01:53 crc kubenswrapper[4756]: I0318 15:01:53.228901 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/kube-rbac-proxy-frr/0.log" Mar 18 15:01:53 crc kubenswrapper[4756]: I0318 15:01:53.261816 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/kube-rbac-proxy/0.log" Mar 18 15:01:53 crc kubenswrapper[4756]: I0318 15:01:53.651827 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jpbxl_4663e088-ab8a-4f74-b7e9-f4d258772346/frr-k8s-webhook-server/0.log" Mar 18 15:01:53 crc kubenswrapper[4756]: I0318 15:01:53.774562 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/reloader/0.log" Mar 18 15:01:54 crc kubenswrapper[4756]: I0318 15:01:54.208350 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-64f4bf856c-2qfht_7f228f4e-f45a-4cf9-b7d0-b4f44a143c17/manager/0.log" Mar 18 15:01:54 crc kubenswrapper[4756]: I0318 15:01:54.319602 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fpdph_abdfd85a-457d-4a04-bebb-4aae02b3a1ce/frr/0.log" Mar 18 15:01:54 crc kubenswrapper[4756]: I0318 15:01:54.371364 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7989c5c544-j8vcg_e9fab363-b0f2-4beb-ae91-d3cfdac9e407/webhook-server/0.log" Mar 18 15:01:54 crc kubenswrapper[4756]: I0318 15:01:54.729314 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5knbh_f7b4594e-5e27-402a-ab35-ff01fd5392eb/kube-rbac-proxy/0.log" Mar 18 15:01:54 crc kubenswrapper[4756]: I0318 15:01:54.910880 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5knbh_f7b4594e-5e27-402a-ab35-ff01fd5392eb/speaker/0.log" Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.145209 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564102-ddts6"] Mar 18 15:02:00 crc kubenswrapper[4756]: E0318 15:02:00.146074 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1accbcd6-3491-4377-ac4f-0e82900678b4" containerName="extract-content" Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.146086 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1accbcd6-3491-4377-ac4f-0e82900678b4" containerName="extract-content" Mar 18 15:02:00 crc kubenswrapper[4756]: E0318 15:02:00.146098 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1accbcd6-3491-4377-ac4f-0e82900678b4" containerName="registry-server" Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.146106 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1accbcd6-3491-4377-ac4f-0e82900678b4" containerName="registry-server" Mar 18 15:02:00 crc kubenswrapper[4756]: E0318 15:02:00.146136 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1accbcd6-3491-4377-ac4f-0e82900678b4" containerName="extract-utilities" Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.146145 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="1accbcd6-3491-4377-ac4f-0e82900678b4" containerName="extract-utilities" Mar 18 15:02:00 crc kubenswrapper[4756]: E0318 15:02:00.146153 
4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2700a709-bc12-4aa6-a962-6e796330517f" containerName="keystone-cron" Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.146159 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="2700a709-bc12-4aa6-a962-6e796330517f" containerName="keystone-cron" Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.146339 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="1accbcd6-3491-4377-ac4f-0e82900678b4" containerName="registry-server" Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.146356 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="2700a709-bc12-4aa6-a962-6e796330517f" containerName="keystone-cron" Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.147106 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564102-ddts6" Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.151620 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.151630 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.151764 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.158721 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564102-ddts6"] Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.243234 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl5rc\" (UniqueName: \"kubernetes.io/projected/4599defb-1667-4008-8eed-a68665159cac-kube-api-access-rl5rc\") pod \"auto-csr-approver-29564102-ddts6\" (UID: 
\"4599defb-1667-4008-8eed-a68665159cac\") " pod="openshift-infra/auto-csr-approver-29564102-ddts6" Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.345791 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl5rc\" (UniqueName: \"kubernetes.io/projected/4599defb-1667-4008-8eed-a68665159cac-kube-api-access-rl5rc\") pod \"auto-csr-approver-29564102-ddts6\" (UID: \"4599defb-1667-4008-8eed-a68665159cac\") " pod="openshift-infra/auto-csr-approver-29564102-ddts6" Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.372909 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl5rc\" (UniqueName: \"kubernetes.io/projected/4599defb-1667-4008-8eed-a68665159cac-kube-api-access-rl5rc\") pod \"auto-csr-approver-29564102-ddts6\" (UID: \"4599defb-1667-4008-8eed-a68665159cac\") " pod="openshift-infra/auto-csr-approver-29564102-ddts6" Mar 18 15:02:00 crc kubenswrapper[4756]: I0318 15:02:00.468230 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564102-ddts6" Mar 18 15:02:01 crc kubenswrapper[4756]: I0318 15:02:01.158838 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564102-ddts6"] Mar 18 15:02:01 crc kubenswrapper[4756]: I0318 15:02:01.422377 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564102-ddts6" event={"ID":"4599defb-1667-4008-8eed-a68665159cac","Type":"ContainerStarted","Data":"def83e3ee4464c102219fa0d7a181c2bbbfa8d52b13061b61b45304109c34999"} Mar 18 15:02:06 crc kubenswrapper[4756]: I0318 15:02:06.914935 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:02:06 crc kubenswrapper[4756]: I0318 15:02:06.915397 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:02:06 crc kubenswrapper[4756]: I0318 15:02:06.915443 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 15:02:06 crc kubenswrapper[4756]: I0318 15:02:06.916226 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1da14f46d57c4462e93c15657dccaca66ada6e735fdcca329fc0a7b2046da6cd"} pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:02:06 crc 
kubenswrapper[4756]: I0318 15:02:06.916274 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" containerID="cri-o://1da14f46d57c4462e93c15657dccaca66ada6e735fdcca329fc0a7b2046da6cd" gracePeriod=600 Mar 18 15:02:07 crc kubenswrapper[4756]: I0318 15:02:07.478144 4756 generic.go:334] "Generic (PLEG): container finished" podID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerID="1da14f46d57c4462e93c15657dccaca66ada6e735fdcca329fc0a7b2046da6cd" exitCode=0 Mar 18 15:02:07 crc kubenswrapper[4756]: I0318 15:02:07.478228 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerDied","Data":"1da14f46d57c4462e93c15657dccaca66ada6e735fdcca329fc0a7b2046da6cd"} Mar 18 15:02:07 crc kubenswrapper[4756]: I0318 15:02:07.478523 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4"} Mar 18 15:02:07 crc kubenswrapper[4756]: I0318 15:02:07.478546 4756 scope.go:117] "RemoveContainer" containerID="31e53c736b1de5e6ad191ee3c82a1d0cf3b80d2ab83b10587bebc26737f9bf2b" Mar 18 15:02:10 crc kubenswrapper[4756]: I0318 15:02:10.522155 4756 generic.go:334] "Generic (PLEG): container finished" podID="4599defb-1667-4008-8eed-a68665159cac" containerID="be48b0ceea894b98ef1b989c5d5ea085d855db986d656f8222a6efa420786932" exitCode=0 Mar 18 15:02:10 crc kubenswrapper[4756]: I0318 15:02:10.522270 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564102-ddts6" 
event={"ID":"4599defb-1667-4008-8eed-a68665159cac","Type":"ContainerDied","Data":"be48b0ceea894b98ef1b989c5d5ea085d855db986d656f8222a6efa420786932"} Mar 18 15:02:12 crc kubenswrapper[4756]: I0318 15:02:12.818217 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564102-ddts6" Mar 18 15:02:12 crc kubenswrapper[4756]: I0318 15:02:12.925414 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl5rc\" (UniqueName: \"kubernetes.io/projected/4599defb-1667-4008-8eed-a68665159cac-kube-api-access-rl5rc\") pod \"4599defb-1667-4008-8eed-a68665159cac\" (UID: \"4599defb-1667-4008-8eed-a68665159cac\") " Mar 18 15:02:12 crc kubenswrapper[4756]: I0318 15:02:12.934406 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4599defb-1667-4008-8eed-a68665159cac-kube-api-access-rl5rc" (OuterVolumeSpecName: "kube-api-access-rl5rc") pod "4599defb-1667-4008-8eed-a68665159cac" (UID: "4599defb-1667-4008-8eed-a68665159cac"). InnerVolumeSpecName "kube-api-access-rl5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:02:13 crc kubenswrapper[4756]: I0318 15:02:13.027960 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl5rc\" (UniqueName: \"kubernetes.io/projected/4599defb-1667-4008-8eed-a68665159cac-kube-api-access-rl5rc\") on node \"crc\" DevicePath \"\"" Mar 18 15:02:13 crc kubenswrapper[4756]: I0318 15:02:13.554638 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564102-ddts6" event={"ID":"4599defb-1667-4008-8eed-a68665159cac","Type":"ContainerDied","Data":"def83e3ee4464c102219fa0d7a181c2bbbfa8d52b13061b61b45304109c34999"} Mar 18 15:02:13 crc kubenswrapper[4756]: I0318 15:02:13.554698 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="def83e3ee4464c102219fa0d7a181c2bbbfa8d52b13061b61b45304109c34999" Mar 18 15:02:13 crc kubenswrapper[4756]: I0318 15:02:13.554780 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564102-ddts6" Mar 18 15:02:13 crc kubenswrapper[4756]: I0318 15:02:13.891569 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564096-fbk77"] Mar 18 15:02:13 crc kubenswrapper[4756]: I0318 15:02:13.900323 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564096-fbk77"] Mar 18 15:02:15 crc kubenswrapper[4756]: I0318 15:02:15.326485 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f93b0b30-b12f-4891-9245-9a2878902bbd" path="/var/lib/kubelet/pods/f93b0b30-b12f-4891-9245-9a2878902bbd/volumes" Mar 18 15:02:17 crc kubenswrapper[4756]: I0318 15:02:17.478289 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r_b0198b01-80fd-4196-b81c-fbe69a187c18/util/0.log" Mar 18 15:02:18 crc kubenswrapper[4756]: I0318 
15:02:18.089007 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r_b0198b01-80fd-4196-b81c-fbe69a187c18/util/0.log" Mar 18 15:02:18 crc kubenswrapper[4756]: I0318 15:02:18.114922 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r_b0198b01-80fd-4196-b81c-fbe69a187c18/pull/0.log" Mar 18 15:02:18 crc kubenswrapper[4756]: I0318 15:02:18.192082 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r_b0198b01-80fd-4196-b81c-fbe69a187c18/pull/0.log" Mar 18 15:02:18 crc kubenswrapper[4756]: I0318 15:02:18.521228 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r_b0198b01-80fd-4196-b81c-fbe69a187c18/extract/0.log" Mar 18 15:02:18 crc kubenswrapper[4756]: I0318 15:02:18.546502 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r_b0198b01-80fd-4196-b81c-fbe69a187c18/pull/0.log" Mar 18 15:02:18 crc kubenswrapper[4756]: I0318 15:02:18.646750 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kqw7r_b0198b01-80fd-4196-b81c-fbe69a187c18/util/0.log" Mar 18 15:02:18 crc kubenswrapper[4756]: I0318 15:02:18.930469 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f_d8383b75-db63-4c4e-b9c1-3fad8e459899/util/0.log" Mar 18 15:02:19 crc kubenswrapper[4756]: I0318 15:02:19.251501 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f_d8383b75-db63-4c4e-b9c1-3fad8e459899/pull/0.log" Mar 18 15:02:19 crc kubenswrapper[4756]: I0318 15:02:19.296812 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f_d8383b75-db63-4c4e-b9c1-3fad8e459899/util/0.log" Mar 18 15:02:19 crc kubenswrapper[4756]: I0318 15:02:19.406956 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f_d8383b75-db63-4c4e-b9c1-3fad8e459899/pull/0.log" Mar 18 15:02:19 crc kubenswrapper[4756]: I0318 15:02:19.656894 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f_d8383b75-db63-4c4e-b9c1-3fad8e459899/pull/0.log" Mar 18 15:02:19 crc kubenswrapper[4756]: I0318 15:02:19.662260 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f_d8383b75-db63-4c4e-b9c1-3fad8e459899/util/0.log" Mar 18 15:02:19 crc kubenswrapper[4756]: I0318 15:02:19.701575 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c14hr8f_d8383b75-db63-4c4e-b9c1-3fad8e459899/extract/0.log" Mar 18 15:02:19 crc kubenswrapper[4756]: I0318 15:02:19.950255 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv_da51cda2-ab80-4d1a-b074-534c572d5803/util/0.log" Mar 18 15:02:20 crc kubenswrapper[4756]: I0318 15:02:20.312196 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv_da51cda2-ab80-4d1a-b074-534c572d5803/pull/0.log" Mar 18 
15:02:20 crc kubenswrapper[4756]: I0318 15:02:20.322568 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv_da51cda2-ab80-4d1a-b074-534c572d5803/util/0.log" Mar 18 15:02:20 crc kubenswrapper[4756]: I0318 15:02:20.431871 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv_da51cda2-ab80-4d1a-b074-534c572d5803/pull/0.log" Mar 18 15:02:20 crc kubenswrapper[4756]: I0318 15:02:20.667989 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv_da51cda2-ab80-4d1a-b074-534c572d5803/extract/0.log" Mar 18 15:02:20 crc kubenswrapper[4756]: I0318 15:02:20.793858 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv_da51cda2-ab80-4d1a-b074-534c572d5803/util/0.log" Mar 18 15:02:20 crc kubenswrapper[4756]: I0318 15:02:20.800726 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g7rrv_da51cda2-ab80-4d1a-b074-534c572d5803/pull/0.log" Mar 18 15:02:20 crc kubenswrapper[4756]: I0318 15:02:20.988147 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d_81416b4c-681c-434a-9932-bbcc4ed16d11/util/0.log" Mar 18 15:02:21 crc kubenswrapper[4756]: I0318 15:02:21.329473 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d_81416b4c-681c-434a-9932-bbcc4ed16d11/pull/0.log" Mar 18 15:02:21 crc kubenswrapper[4756]: I0318 15:02:21.350068 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d_81416b4c-681c-434a-9932-bbcc4ed16d11/util/0.log" Mar 18 15:02:21 crc kubenswrapper[4756]: I0318 15:02:21.368928 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d_81416b4c-681c-434a-9932-bbcc4ed16d11/pull/0.log" Mar 18 15:02:21 crc kubenswrapper[4756]: I0318 15:02:21.785247 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d_81416b4c-681c-434a-9932-bbcc4ed16d11/util/0.log" Mar 18 15:02:21 crc kubenswrapper[4756]: I0318 15:02:21.825713 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d_81416b4c-681c-434a-9932-bbcc4ed16d11/extract/0.log" Mar 18 15:02:21 crc kubenswrapper[4756]: I0318 15:02:21.942709 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcjxp8d_81416b4c-681c-434a-9932-bbcc4ed16d11/pull/0.log" Mar 18 15:02:22 crc kubenswrapper[4756]: I0318 15:02:22.131902 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4r2hb_72438dca-b50b-4607-a61b-a6935f1b296a/extract-utilities/0.log" Mar 18 15:02:22 crc kubenswrapper[4756]: I0318 15:02:22.450227 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4r2hb_72438dca-b50b-4607-a61b-a6935f1b296a/extract-utilities/0.log" Mar 18 15:02:22 crc kubenswrapper[4756]: I0318 15:02:22.489029 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4r2hb_72438dca-b50b-4607-a61b-a6935f1b296a/extract-content/0.log" Mar 18 15:02:22 crc kubenswrapper[4756]: I0318 15:02:22.622303 4756 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4r2hb_72438dca-b50b-4607-a61b-a6935f1b296a/extract-content/0.log" Mar 18 15:02:22 crc kubenswrapper[4756]: I0318 15:02:22.802365 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4r2hb_72438dca-b50b-4607-a61b-a6935f1b296a/extract-utilities/0.log" Mar 18 15:02:23 crc kubenswrapper[4756]: I0318 15:02:23.236925 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zph2b_6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e/extract-utilities/0.log" Mar 18 15:02:23 crc kubenswrapper[4756]: I0318 15:02:23.274368 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4r2hb_72438dca-b50b-4607-a61b-a6935f1b296a/extract-content/0.log" Mar 18 15:02:23 crc kubenswrapper[4756]: I0318 15:02:23.631804 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4r2hb_72438dca-b50b-4607-a61b-a6935f1b296a/registry-server/0.log" Mar 18 15:02:23 crc kubenswrapper[4756]: I0318 15:02:23.720973 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zph2b_6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e/extract-utilities/0.log" Mar 18 15:02:23 crc kubenswrapper[4756]: I0318 15:02:23.832848 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zph2b_6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e/extract-content/0.log" Mar 18 15:02:23 crc kubenswrapper[4756]: I0318 15:02:23.883435 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zph2b_6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e/extract-content/0.log" Mar 18 15:02:23 crc kubenswrapper[4756]: I0318 15:02:23.961368 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-zph2b_6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e/extract-utilities/0.log" Mar 18 15:02:24 crc kubenswrapper[4756]: I0318 15:02:24.087346 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zph2b_6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e/extract-content/0.log" Mar 18 15:02:24 crc kubenswrapper[4756]: I0318 15:02:24.154607 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-45k2m_df050fc3-f811-40d6-a005-b9dc7062fdf5/marketplace-operator/0.log" Mar 18 15:02:24 crc kubenswrapper[4756]: I0318 15:02:24.161655 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zph2b_6b0a222b-e5d0-4d2d-84ba-43cc19a4b05e/registry-server/0.log" Mar 18 15:02:24 crc kubenswrapper[4756]: I0318 15:02:24.433574 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tbqbv_5f40a331-0e50-4142-86f5-b0a5a3162f3b/extract-utilities/0.log" Mar 18 15:02:24 crc kubenswrapper[4756]: I0318 15:02:24.791770 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tbqbv_5f40a331-0e50-4142-86f5-b0a5a3162f3b/extract-content/0.log" Mar 18 15:02:24 crc kubenswrapper[4756]: I0318 15:02:24.888315 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tbqbv_5f40a331-0e50-4142-86f5-b0a5a3162f3b/extract-content/0.log" Mar 18 15:02:24 crc kubenswrapper[4756]: I0318 15:02:24.931337 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tbqbv_5f40a331-0e50-4142-86f5-b0a5a3162f3b/extract-utilities/0.log" Mar 18 15:02:25 crc kubenswrapper[4756]: I0318 15:02:25.151316 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-tbqbv_5f40a331-0e50-4142-86f5-b0a5a3162f3b/extract-utilities/0.log" Mar 18 15:02:25 crc kubenswrapper[4756]: I0318 15:02:25.231466 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tbqbv_5f40a331-0e50-4142-86f5-b0a5a3162f3b/registry-server/0.log" Mar 18 15:02:25 crc kubenswrapper[4756]: I0318 15:02:25.310482 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tbqbv_5f40a331-0e50-4142-86f5-b0a5a3162f3b/extract-content/0.log" Mar 18 15:02:25 crc kubenswrapper[4756]: I0318 15:02:25.313628 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mwdj7_998c254a-7789-4cf4-9445-1f6a76068bd0/extract-utilities/0.log" Mar 18 15:02:25 crc kubenswrapper[4756]: I0318 15:02:25.709063 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mwdj7_998c254a-7789-4cf4-9445-1f6a76068bd0/extract-utilities/0.log" Mar 18 15:02:25 crc kubenswrapper[4756]: I0318 15:02:25.717834 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mwdj7_998c254a-7789-4cf4-9445-1f6a76068bd0/extract-content/0.log" Mar 18 15:02:25 crc kubenswrapper[4756]: I0318 15:02:25.866643 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mwdj7_998c254a-7789-4cf4-9445-1f6a76068bd0/extract-content/0.log" Mar 18 15:02:26 crc kubenswrapper[4756]: I0318 15:02:26.124873 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mwdj7_998c254a-7789-4cf4-9445-1f6a76068bd0/extract-utilities/0.log" Mar 18 15:02:26 crc kubenswrapper[4756]: I0318 15:02:26.213636 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mwdj7_998c254a-7789-4cf4-9445-1f6a76068bd0/extract-content/0.log" Mar 
18 15:02:26 crc kubenswrapper[4756]: I0318 15:02:26.590650 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mwdj7_998c254a-7789-4cf4-9445-1f6a76068bd0/registry-server/0.log" Mar 18 15:02:35 crc kubenswrapper[4756]: I0318 15:02:35.169255 4756 scope.go:117] "RemoveContainer" containerID="b307208590d4faab1576bde2d6ec771d0e43e3062bb8529045c6c10ac0782c02" Mar 18 15:02:35 crc kubenswrapper[4756]: I0318 15:02:35.191268 4756 scope.go:117] "RemoveContainer" containerID="19ad716af839482e84ea38fc7d2df2aa09ca3d17323b40a6aabcaa425f3c5a0f" Mar 18 15:02:35 crc kubenswrapper[4756]: I0318 15:02:35.239184 4756 scope.go:117] "RemoveContainer" containerID="df9bc9c3728533fd8250aa1a3aa67b047c41ec19cfc177ab862a68ed1e57aba7" Mar 18 15:02:49 crc kubenswrapper[4756]: I0318 15:02:49.078845 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-6kvvd_df2f3290-a194-4fa7-9c5c-533c329bc34b/prometheus-operator/0.log" Mar 18 15:02:49 crc kubenswrapper[4756]: I0318 15:02:49.143703 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6ff97cbdb4-wcczp_7cef672e-9e83-4a19-90e0-8d078a871e02/prometheus-operator-admission-webhook/0.log" Mar 18 15:02:49 crc kubenswrapper[4756]: I0318 15:02:49.290974 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6ff97cbdb4-49rhw_b1ddfb6b-2173-4ddc-84aa-437858c62a2a/prometheus-operator-admission-webhook/0.log" Mar 18 15:02:49 crc kubenswrapper[4756]: I0318 15:02:49.431043 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-6cdcccbffc-hw5bz_5044ac67-cd21-42cd-8fc4-63d7a532038d/perses-operator/0.log" Mar 18 15:02:49 crc kubenswrapper[4756]: I0318 15:02:49.452775 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-7rvzd_626fab88-a2be-43fb-9679-6324c7105bd9/operator/0.log" Mar 18 15:03:11 crc kubenswrapper[4756]: I0318 15:03:11.770419 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7d4b6cd968-2lpml_84e944f2-90e7-4c7a-802d-703a8ef82200/kube-rbac-proxy/0.log" Mar 18 15:03:11 crc kubenswrapper[4756]: I0318 15:03:11.834644 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7d4b6cd968-2lpml_84e944f2-90e7-4c7a-802d-703a8ef82200/manager/0.log" Mar 18 15:03:17 crc kubenswrapper[4756]: E0318 15:03:17.338600 4756 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.34:34414->38.129.56.34:39757: write tcp 38.129.56.34:34414->38.129.56.34:39757: write: broken pipe Mar 18 15:03:35 crc kubenswrapper[4756]: I0318 15:03:35.370466 4756 scope.go:117] "RemoveContainer" containerID="3636ed4ef77bf1ae76d810ed4f925bb6b7d09b0c9da0b5263437f979b73af531" Mar 18 15:03:35 crc kubenswrapper[4756]: I0318 15:03:35.415013 4756 scope.go:117] "RemoveContainer" containerID="faa9f41709e23c7a597da8f5e9747ad0367651efd12b7ea89c0614567c751b4c" Mar 18 15:03:35 crc kubenswrapper[4756]: I0318 15:03:35.480269 4756 scope.go:117] "RemoveContainer" containerID="4da3470224078e85fc26dc0700f936f6d0b1cf6df78d569e43ddd15c296579da" Mar 18 15:04:00 crc kubenswrapper[4756]: I0318 15:04:00.143741 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564104-ph72j"] Mar 18 15:04:00 crc kubenswrapper[4756]: E0318 15:04:00.144533 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4599defb-1667-4008-8eed-a68665159cac" containerName="oc" Mar 18 15:04:00 crc kubenswrapper[4756]: I0318 15:04:00.144545 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="4599defb-1667-4008-8eed-a68665159cac" containerName="oc" Mar 18 
15:04:00 crc kubenswrapper[4756]: I0318 15:04:00.144742 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="4599defb-1667-4008-8eed-a68665159cac" containerName="oc" Mar 18 15:04:00 crc kubenswrapper[4756]: I0318 15:04:00.145488 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564104-ph72j" Mar 18 15:04:00 crc kubenswrapper[4756]: I0318 15:04:00.147802 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:04:00 crc kubenswrapper[4756]: I0318 15:04:00.147958 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:04:00 crc kubenswrapper[4756]: I0318 15:04:00.148757 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 15:04:00 crc kubenswrapper[4756]: I0318 15:04:00.192539 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564104-ph72j"] Mar 18 15:04:00 crc kubenswrapper[4756]: I0318 15:04:00.307239 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l75rq\" (UniqueName: \"kubernetes.io/projected/600f14cc-1ff0-45ca-88ff-8f1fd727609a-kube-api-access-l75rq\") pod \"auto-csr-approver-29564104-ph72j\" (UID: \"600f14cc-1ff0-45ca-88ff-8f1fd727609a\") " pod="openshift-infra/auto-csr-approver-29564104-ph72j" Mar 18 15:04:00 crc kubenswrapper[4756]: I0318 15:04:00.409455 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l75rq\" (UniqueName: \"kubernetes.io/projected/600f14cc-1ff0-45ca-88ff-8f1fd727609a-kube-api-access-l75rq\") pod \"auto-csr-approver-29564104-ph72j\" (UID: \"600f14cc-1ff0-45ca-88ff-8f1fd727609a\") " pod="openshift-infra/auto-csr-approver-29564104-ph72j" Mar 18 15:04:00 crc kubenswrapper[4756]: I0318 15:04:00.442803 
4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l75rq\" (UniqueName: \"kubernetes.io/projected/600f14cc-1ff0-45ca-88ff-8f1fd727609a-kube-api-access-l75rq\") pod \"auto-csr-approver-29564104-ph72j\" (UID: \"600f14cc-1ff0-45ca-88ff-8f1fd727609a\") " pod="openshift-infra/auto-csr-approver-29564104-ph72j" Mar 18 15:04:00 crc kubenswrapper[4756]: I0318 15:04:00.464764 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564104-ph72j" Mar 18 15:04:01 crc kubenswrapper[4756]: I0318 15:04:01.229460 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564104-ph72j"] Mar 18 15:04:01 crc kubenswrapper[4756]: I0318 15:04:01.514888 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564104-ph72j" event={"ID":"600f14cc-1ff0-45ca-88ff-8f1fd727609a","Type":"ContainerStarted","Data":"4dcde130d1c0ef248f6e499449f5b08f3fecae7e347ef3403e8a229452976f94"} Mar 18 15:04:09 crc kubenswrapper[4756]: I0318 15:04:09.587544 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564104-ph72j" event={"ID":"600f14cc-1ff0-45ca-88ff-8f1fd727609a","Type":"ContainerStarted","Data":"eceb29f19419aefc27b24dd5ee84d167b328f162fb7994519cb8c55dc6883c82"} Mar 18 15:04:09 crc kubenswrapper[4756]: I0318 15:04:09.604546 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564104-ph72j" podStartSLOduration=2.347168004 podStartE2EDuration="9.604522517s" podCreationTimestamp="2026-03-18 15:04:00 +0000 UTC" firstStartedPulling="2026-03-18 15:04:01.242308214 +0000 UTC m=+3842.556726179" lastFinishedPulling="2026-03-18 15:04:08.499662717 +0000 UTC m=+3849.814080692" observedRunningTime="2026-03-18 15:04:09.602347058 +0000 UTC m=+3850.916765033" watchObservedRunningTime="2026-03-18 15:04:09.604522517 +0000 UTC m=+3850.918940492" 
Mar 18 15:04:10 crc kubenswrapper[4756]: I0318 15:04:10.600218 4756 generic.go:334] "Generic (PLEG): container finished" podID="600f14cc-1ff0-45ca-88ff-8f1fd727609a" containerID="eceb29f19419aefc27b24dd5ee84d167b328f162fb7994519cb8c55dc6883c82" exitCode=0 Mar 18 15:04:10 crc kubenswrapper[4756]: I0318 15:04:10.600562 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564104-ph72j" event={"ID":"600f14cc-1ff0-45ca-88ff-8f1fd727609a","Type":"ContainerDied","Data":"eceb29f19419aefc27b24dd5ee84d167b328f162fb7994519cb8c55dc6883c82"} Mar 18 15:04:12 crc kubenswrapper[4756]: I0318 15:04:12.728840 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564104-ph72j" Mar 18 15:04:12 crc kubenswrapper[4756]: I0318 15:04:12.821283 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l75rq\" (UniqueName: \"kubernetes.io/projected/600f14cc-1ff0-45ca-88ff-8f1fd727609a-kube-api-access-l75rq\") pod \"600f14cc-1ff0-45ca-88ff-8f1fd727609a\" (UID: \"600f14cc-1ff0-45ca-88ff-8f1fd727609a\") " Mar 18 15:04:12 crc kubenswrapper[4756]: I0318 15:04:12.826897 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/600f14cc-1ff0-45ca-88ff-8f1fd727609a-kube-api-access-l75rq" (OuterVolumeSpecName: "kube-api-access-l75rq") pod "600f14cc-1ff0-45ca-88ff-8f1fd727609a" (UID: "600f14cc-1ff0-45ca-88ff-8f1fd727609a"). InnerVolumeSpecName "kube-api-access-l75rq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:04:12 crc kubenswrapper[4756]: I0318 15:04:12.923957 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l75rq\" (UniqueName: \"kubernetes.io/projected/600f14cc-1ff0-45ca-88ff-8f1fd727609a-kube-api-access-l75rq\") on node \"crc\" DevicePath \"\"" Mar 18 15:04:13 crc kubenswrapper[4756]: I0318 15:04:13.626470 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564104-ph72j" event={"ID":"600f14cc-1ff0-45ca-88ff-8f1fd727609a","Type":"ContainerDied","Data":"4dcde130d1c0ef248f6e499449f5b08f3fecae7e347ef3403e8a229452976f94"} Mar 18 15:04:13 crc kubenswrapper[4756]: I0318 15:04:13.626511 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dcde130d1c0ef248f6e499449f5b08f3fecae7e347ef3403e8a229452976f94" Mar 18 15:04:13 crc kubenswrapper[4756]: I0318 15:04:13.626560 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564104-ph72j" Mar 18 15:04:13 crc kubenswrapper[4756]: I0318 15:04:13.795322 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564098-hrrsk"] Mar 18 15:04:13 crc kubenswrapper[4756]: I0318 15:04:13.805639 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564098-hrrsk"] Mar 18 15:04:15 crc kubenswrapper[4756]: I0318 15:04:15.327285 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51e5d910-0d11-4baa-987f-ffe6ba4e5e99" path="/var/lib/kubelet/pods/51e5d910-0d11-4baa-987f-ffe6ba4e5e99/volumes" Mar 18 15:04:35 crc kubenswrapper[4756]: I0318 15:04:35.626652 4756 scope.go:117] "RemoveContainer" containerID="f31486a4e7cc44f2f575797f0308b587f93c51618a81d609bc5a8256ab79c483" Mar 18 15:04:36 crc kubenswrapper[4756]: I0318 15:04:36.915299 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:04:36 crc kubenswrapper[4756]: I0318 15:04:36.915574 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:05:06 crc kubenswrapper[4756]: I0318 15:05:06.915554 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:05:06 crc kubenswrapper[4756]: I0318 15:05:06.917170 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:05:36 crc kubenswrapper[4756]: I0318 15:05:36.915205 4756 patch_prober.go:28] interesting pod/machine-config-daemon-qvpkg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 15:05:36 crc kubenswrapper[4756]: I0318 15:05:36.915805 4756 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 15:05:36 crc kubenswrapper[4756]: I0318 15:05:36.915859 4756 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" Mar 18 15:05:36 crc kubenswrapper[4756]: I0318 15:05:36.916747 4756 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4"} pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 15:05:36 crc kubenswrapper[4756]: I0318 15:05:36.916835 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerName="machine-config-daemon" containerID="cri-o://42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" gracePeriod=600 Mar 18 15:05:37 crc kubenswrapper[4756]: E0318 15:05:37.034779 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:05:37 crc kubenswrapper[4756]: I0318 15:05:37.465498 4756 generic.go:334] "Generic (PLEG): container finished" podID="c10ecbb9-ddab-48c7-9a86-abd122951622" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" exitCode=0 Mar 18 15:05:37 crc kubenswrapper[4756]: I0318 15:05:37.465564 4756 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerDied","Data":"42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4"} Mar 18 15:05:37 crc kubenswrapper[4756]: I0318 15:05:37.465850 4756 scope.go:117] "RemoveContainer" containerID="1da14f46d57c4462e93c15657dccaca66ada6e735fdcca329fc0a7b2046da6cd" Mar 18 15:05:37 crc kubenswrapper[4756]: I0318 15:05:37.466586 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:05:37 crc kubenswrapper[4756]: E0318 15:05:37.466841 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:05:44 crc kubenswrapper[4756]: I0318 15:05:44.527728 4756 generic.go:334] "Generic (PLEG): container finished" podID="b8387997-ea84-4ab9-aecd-315e77e225e0" containerID="15934e12b5bda09c6103189b8477de1b9da42aea68338899bdb89c4830bc1b26" exitCode=0 Mar 18 15:05:44 crc kubenswrapper[4756]: I0318 15:05:44.528112 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m8n6h/must-gather-7s6gq" event={"ID":"b8387997-ea84-4ab9-aecd-315e77e225e0","Type":"ContainerDied","Data":"15934e12b5bda09c6103189b8477de1b9da42aea68338899bdb89c4830bc1b26"} Mar 18 15:05:44 crc kubenswrapper[4756]: I0318 15:05:44.529136 4756 scope.go:117] "RemoveContainer" containerID="15934e12b5bda09c6103189b8477de1b9da42aea68338899bdb89c4830bc1b26" Mar 18 15:05:45 crc kubenswrapper[4756]: I0318 15:05:45.345072 4756 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-m8n6h_must-gather-7s6gq_b8387997-ea84-4ab9-aecd-315e77e225e0/gather/0.log" Mar 18 15:05:49 crc kubenswrapper[4756]: I0318 15:05:49.322269 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:05:49 crc kubenswrapper[4756]: E0318 15:05:49.323361 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:05:55 crc kubenswrapper[4756]: I0318 15:05:55.439483 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m8n6h/must-gather-7s6gq"] Mar 18 15:05:55 crc kubenswrapper[4756]: I0318 15:05:55.440265 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-m8n6h/must-gather-7s6gq" podUID="b8387997-ea84-4ab9-aecd-315e77e225e0" containerName="copy" containerID="cri-o://bb3f759479d9429141ba68eda66e087e47db88d50f2abaaea32bda4171df6146" gracePeriod=2 Mar 18 15:05:55 crc kubenswrapper[4756]: I0318 15:05:55.450212 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m8n6h/must-gather-7s6gq"] Mar 18 15:05:55 crc kubenswrapper[4756]: I0318 15:05:55.672183 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m8n6h_must-gather-7s6gq_b8387997-ea84-4ab9-aecd-315e77e225e0/copy/0.log" Mar 18 15:05:55 crc kubenswrapper[4756]: I0318 15:05:55.675547 4756 generic.go:334] "Generic (PLEG): container finished" podID="b8387997-ea84-4ab9-aecd-315e77e225e0" containerID="bb3f759479d9429141ba68eda66e087e47db88d50f2abaaea32bda4171df6146" exitCode=143 Mar 18 15:05:56 crc 
kubenswrapper[4756]: I0318 15:05:56.549936 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m8n6h_must-gather-7s6gq_b8387997-ea84-4ab9-aecd-315e77e225e0/copy/0.log" Mar 18 15:05:56 crc kubenswrapper[4756]: I0318 15:05:56.550621 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m8n6h/must-gather-7s6gq" Mar 18 15:05:56 crc kubenswrapper[4756]: I0318 15:05:56.661650 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8387997-ea84-4ab9-aecd-315e77e225e0-must-gather-output\") pod \"b8387997-ea84-4ab9-aecd-315e77e225e0\" (UID: \"b8387997-ea84-4ab9-aecd-315e77e225e0\") " Mar 18 15:05:56 crc kubenswrapper[4756]: I0318 15:05:56.661757 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5zv7\" (UniqueName: \"kubernetes.io/projected/b8387997-ea84-4ab9-aecd-315e77e225e0-kube-api-access-h5zv7\") pod \"b8387997-ea84-4ab9-aecd-315e77e225e0\" (UID: \"b8387997-ea84-4ab9-aecd-315e77e225e0\") " Mar 18 15:05:56 crc kubenswrapper[4756]: I0318 15:05:56.677319 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8387997-ea84-4ab9-aecd-315e77e225e0-kube-api-access-h5zv7" (OuterVolumeSpecName: "kube-api-access-h5zv7") pod "b8387997-ea84-4ab9-aecd-315e77e225e0" (UID: "b8387997-ea84-4ab9-aecd-315e77e225e0"). InnerVolumeSpecName "kube-api-access-h5zv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:05:56 crc kubenswrapper[4756]: I0318 15:05:56.687418 4756 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m8n6h_must-gather-7s6gq_b8387997-ea84-4ab9-aecd-315e77e225e0/copy/0.log" Mar 18 15:05:56 crc kubenswrapper[4756]: I0318 15:05:56.687836 4756 scope.go:117] "RemoveContainer" containerID="bb3f759479d9429141ba68eda66e087e47db88d50f2abaaea32bda4171df6146" Mar 18 15:05:56 crc kubenswrapper[4756]: I0318 15:05:56.687911 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m8n6h/must-gather-7s6gq" Mar 18 15:05:56 crc kubenswrapper[4756]: I0318 15:05:56.738399 4756 scope.go:117] "RemoveContainer" containerID="15934e12b5bda09c6103189b8477de1b9da42aea68338899bdb89c4830bc1b26" Mar 18 15:05:56 crc kubenswrapper[4756]: I0318 15:05:56.763720 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5zv7\" (UniqueName: \"kubernetes.io/projected/b8387997-ea84-4ab9-aecd-315e77e225e0-kube-api-access-h5zv7\") on node \"crc\" DevicePath \"\"" Mar 18 15:05:56 crc kubenswrapper[4756]: I0318 15:05:56.863428 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8387997-ea84-4ab9-aecd-315e77e225e0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b8387997-ea84-4ab9-aecd-315e77e225e0" (UID: "b8387997-ea84-4ab9-aecd-315e77e225e0"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:05:56 crc kubenswrapper[4756]: I0318 15:05:56.867301 4756 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8387997-ea84-4ab9-aecd-315e77e225e0-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 18 15:05:57 crc kubenswrapper[4756]: I0318 15:05:57.326074 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8387997-ea84-4ab9-aecd-315e77e225e0" path="/var/lib/kubelet/pods/b8387997-ea84-4ab9-aecd-315e77e225e0/volumes" Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.147042 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564106-fpg28"] Mar 18 15:06:00 crc kubenswrapper[4756]: E0318 15:06:00.148020 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8387997-ea84-4ab9-aecd-315e77e225e0" containerName="copy" Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.148033 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8387997-ea84-4ab9-aecd-315e77e225e0" containerName="copy" Mar 18 15:06:00 crc kubenswrapper[4756]: E0318 15:06:00.148044 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="600f14cc-1ff0-45ca-88ff-8f1fd727609a" containerName="oc" Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.148052 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="600f14cc-1ff0-45ca-88ff-8f1fd727609a" containerName="oc" Mar 18 15:06:00 crc kubenswrapper[4756]: E0318 15:06:00.148065 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8387997-ea84-4ab9-aecd-315e77e225e0" containerName="gather" Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.148072 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8387997-ea84-4ab9-aecd-315e77e225e0" containerName="gather" Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.148260 4756 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="b8387997-ea84-4ab9-aecd-315e77e225e0" containerName="copy" Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.148274 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="600f14cc-1ff0-45ca-88ff-8f1fd727609a" containerName="oc" Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.148297 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8387997-ea84-4ab9-aecd-315e77e225e0" containerName="gather" Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.149153 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564106-fpg28" Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.158320 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.158561 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.158704 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.168037 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564106-fpg28"] Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.250750 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dp7g\" (UniqueName: \"kubernetes.io/projected/ac5b831d-a880-466f-894f-65eef589e1db-kube-api-access-4dp7g\") pod \"auto-csr-approver-29564106-fpg28\" (UID: \"ac5b831d-a880-466f-894f-65eef589e1db\") " pod="openshift-infra/auto-csr-approver-29564106-fpg28" Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.352922 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dp7g\" (UniqueName: 
\"kubernetes.io/projected/ac5b831d-a880-466f-894f-65eef589e1db-kube-api-access-4dp7g\") pod \"auto-csr-approver-29564106-fpg28\" (UID: \"ac5b831d-a880-466f-894f-65eef589e1db\") " pod="openshift-infra/auto-csr-approver-29564106-fpg28" Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.391797 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dp7g\" (UniqueName: \"kubernetes.io/projected/ac5b831d-a880-466f-894f-65eef589e1db-kube-api-access-4dp7g\") pod \"auto-csr-approver-29564106-fpg28\" (UID: \"ac5b831d-a880-466f-894f-65eef589e1db\") " pod="openshift-infra/auto-csr-approver-29564106-fpg28" Mar 18 15:06:00 crc kubenswrapper[4756]: I0318 15:06:00.481586 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564106-fpg28" Mar 18 15:06:01 crc kubenswrapper[4756]: I0318 15:06:01.239234 4756 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 15:06:01 crc kubenswrapper[4756]: I0318 15:06:01.239911 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564106-fpg28"] Mar 18 15:06:01 crc kubenswrapper[4756]: I0318 15:06:01.764980 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564106-fpg28" event={"ID":"ac5b831d-a880-466f-894f-65eef589e1db","Type":"ContainerStarted","Data":"f0c9918e021870c099f15cd82c901307dcaf783e9d6899d5eeb0b489ec017667"} Mar 18 15:06:03 crc kubenswrapper[4756]: I0318 15:06:03.783545 4756 generic.go:334] "Generic (PLEG): container finished" podID="ac5b831d-a880-466f-894f-65eef589e1db" containerID="9f85c46f1312e015ad0147c2af86665b6f207f17161ebd9ffdeba0826d5c4f92" exitCode=0 Mar 18 15:06:03 crc kubenswrapper[4756]: I0318 15:06:03.783661 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564106-fpg28" 
event={"ID":"ac5b831d-a880-466f-894f-65eef589e1db","Type":"ContainerDied","Data":"9f85c46f1312e015ad0147c2af86665b6f207f17161ebd9ffdeba0826d5c4f92"} Mar 18 15:06:04 crc kubenswrapper[4756]: I0318 15:06:04.315806 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:06:04 crc kubenswrapper[4756]: E0318 15:06:04.316473 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:06:05 crc kubenswrapper[4756]: I0318 15:06:05.642938 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564106-fpg28" Mar 18 15:06:05 crc kubenswrapper[4756]: I0318 15:06:05.766785 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dp7g\" (UniqueName: \"kubernetes.io/projected/ac5b831d-a880-466f-894f-65eef589e1db-kube-api-access-4dp7g\") pod \"ac5b831d-a880-466f-894f-65eef589e1db\" (UID: \"ac5b831d-a880-466f-894f-65eef589e1db\") " Mar 18 15:06:05 crc kubenswrapper[4756]: I0318 15:06:05.775267 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5b831d-a880-466f-894f-65eef589e1db-kube-api-access-4dp7g" (OuterVolumeSpecName: "kube-api-access-4dp7g") pod "ac5b831d-a880-466f-894f-65eef589e1db" (UID: "ac5b831d-a880-466f-894f-65eef589e1db"). InnerVolumeSpecName "kube-api-access-4dp7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:06:05 crc kubenswrapper[4756]: I0318 15:06:05.803837 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564106-fpg28" event={"ID":"ac5b831d-a880-466f-894f-65eef589e1db","Type":"ContainerDied","Data":"f0c9918e021870c099f15cd82c901307dcaf783e9d6899d5eeb0b489ec017667"} Mar 18 15:06:05 crc kubenswrapper[4756]: I0318 15:06:05.803892 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0c9918e021870c099f15cd82c901307dcaf783e9d6899d5eeb0b489ec017667" Mar 18 15:06:05 crc kubenswrapper[4756]: I0318 15:06:05.803897 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564106-fpg28" Mar 18 15:06:05 crc kubenswrapper[4756]: I0318 15:06:05.869299 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dp7g\" (UniqueName: \"kubernetes.io/projected/ac5b831d-a880-466f-894f-65eef589e1db-kube-api-access-4dp7g\") on node \"crc\" DevicePath \"\"" Mar 18 15:06:06 crc kubenswrapper[4756]: I0318 15:06:06.712706 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564100-92vdr"] Mar 18 15:06:06 crc kubenswrapper[4756]: I0318 15:06:06.742243 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564100-92vdr"] Mar 18 15:06:07 crc kubenswrapper[4756]: I0318 15:06:07.326872 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae5ca546-1126-4c61-a42d-c561dbb8490d" path="/var/lib/kubelet/pods/ae5ca546-1126-4c61-a42d-c561dbb8490d/volumes" Mar 18 15:06:17 crc kubenswrapper[4756]: I0318 15:06:17.323215 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:06:17 crc kubenswrapper[4756]: E0318 15:06:17.323893 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:06:30 crc kubenswrapper[4756]: I0318 15:06:30.315364 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:06:30 crc kubenswrapper[4756]: E0318 15:06:30.316054 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:06:35 crc kubenswrapper[4756]: I0318 15:06:35.725187 4756 scope.go:117] "RemoveContainer" containerID="5d10793298ced0ca9dca32db7e81398462a119a0f084762400fd3865129a7acc" Mar 18 15:06:43 crc kubenswrapper[4756]: I0318 15:06:43.316506 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:06:43 crc kubenswrapper[4756]: E0318 15:06:43.317186 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:06:58 crc kubenswrapper[4756]: I0318 15:06:58.315929 4756 scope.go:117] "RemoveContainer" 
containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:06:58 crc kubenswrapper[4756]: E0318 15:06:58.316758 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:07:12 crc kubenswrapper[4756]: I0318 15:07:12.316050 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:07:12 crc kubenswrapper[4756]: E0318 15:07:12.316949 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:07:16 crc kubenswrapper[4756]: I0318 15:07:16.129568 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-znvsh"] Mar 18 15:07:16 crc kubenswrapper[4756]: E0318 15:07:16.132988 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5b831d-a880-466f-894f-65eef589e1db" containerName="oc" Mar 18 15:07:16 crc kubenswrapper[4756]: I0318 15:07:16.133014 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5b831d-a880-466f-894f-65eef589e1db" containerName="oc" Mar 18 15:07:16 crc kubenswrapper[4756]: I0318 15:07:16.133341 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5b831d-a880-466f-894f-65eef589e1db" containerName="oc" Mar 18 15:07:16 crc 
kubenswrapper[4756]: I0318 15:07:16.135386 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:07:16 crc kubenswrapper[4756]: I0318 15:07:16.156455 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-znvsh"] Mar 18 15:07:16 crc kubenswrapper[4756]: I0318 15:07:16.238652 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181ea872-4c6d-4778-a7da-909a8e63b0d1-catalog-content\") pod \"redhat-operators-znvsh\" (UID: \"181ea872-4c6d-4778-a7da-909a8e63b0d1\") " pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:07:16 crc kubenswrapper[4756]: I0318 15:07:16.238848 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkjt4\" (UniqueName: \"kubernetes.io/projected/181ea872-4c6d-4778-a7da-909a8e63b0d1-kube-api-access-hkjt4\") pod \"redhat-operators-znvsh\" (UID: \"181ea872-4c6d-4778-a7da-909a8e63b0d1\") " pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:07:16 crc kubenswrapper[4756]: I0318 15:07:16.238918 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181ea872-4c6d-4778-a7da-909a8e63b0d1-utilities\") pod \"redhat-operators-znvsh\" (UID: \"181ea872-4c6d-4778-a7da-909a8e63b0d1\") " pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:07:16 crc kubenswrapper[4756]: I0318 15:07:16.341604 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkjt4\" (UniqueName: \"kubernetes.io/projected/181ea872-4c6d-4778-a7da-909a8e63b0d1-kube-api-access-hkjt4\") pod \"redhat-operators-znvsh\" (UID: \"181ea872-4c6d-4778-a7da-909a8e63b0d1\") " pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:07:16 crc 
kubenswrapper[4756]: I0318 15:07:16.341749 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181ea872-4c6d-4778-a7da-909a8e63b0d1-utilities\") pod \"redhat-operators-znvsh\" (UID: \"181ea872-4c6d-4778-a7da-909a8e63b0d1\") " pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:07:16 crc kubenswrapper[4756]: I0318 15:07:16.341840 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181ea872-4c6d-4778-a7da-909a8e63b0d1-catalog-content\") pod \"redhat-operators-znvsh\" (UID: \"181ea872-4c6d-4778-a7da-909a8e63b0d1\") " pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:07:16 crc kubenswrapper[4756]: I0318 15:07:16.342487 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181ea872-4c6d-4778-a7da-909a8e63b0d1-catalog-content\") pod \"redhat-operators-znvsh\" (UID: \"181ea872-4c6d-4778-a7da-909a8e63b0d1\") " pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:07:16 crc kubenswrapper[4756]: I0318 15:07:16.345480 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181ea872-4c6d-4778-a7da-909a8e63b0d1-utilities\") pod \"redhat-operators-znvsh\" (UID: \"181ea872-4c6d-4778-a7da-909a8e63b0d1\") " pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:07:16 crc kubenswrapper[4756]: I0318 15:07:16.366241 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkjt4\" (UniqueName: \"kubernetes.io/projected/181ea872-4c6d-4778-a7da-909a8e63b0d1-kube-api-access-hkjt4\") pod \"redhat-operators-znvsh\" (UID: \"181ea872-4c6d-4778-a7da-909a8e63b0d1\") " pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:07:16 crc kubenswrapper[4756]: I0318 15:07:16.461212 4756 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:07:17 crc kubenswrapper[4756]: W0318 15:07:17.254908 4756 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod181ea872_4c6d_4778_a7da_909a8e63b0d1.slice/crio-eba41c93d525f3db0a8fe50895abf6615b70a1aafd3f9f0dd2e8f3ead895a736 WatchSource:0}: Error finding container eba41c93d525f3db0a8fe50895abf6615b70a1aafd3f9f0dd2e8f3ead895a736: Status 404 returned error can't find the container with id eba41c93d525f3db0a8fe50895abf6615b70a1aafd3f9f0dd2e8f3ead895a736 Mar 18 15:07:17 crc kubenswrapper[4756]: I0318 15:07:17.258980 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-znvsh"] Mar 18 15:07:17 crc kubenswrapper[4756]: I0318 15:07:17.457276 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znvsh" event={"ID":"181ea872-4c6d-4778-a7da-909a8e63b0d1","Type":"ContainerStarted","Data":"eba41c93d525f3db0a8fe50895abf6615b70a1aafd3f9f0dd2e8f3ead895a736"} Mar 18 15:07:18 crc kubenswrapper[4756]: I0318 15:07:18.468278 4756 generic.go:334] "Generic (PLEG): container finished" podID="181ea872-4c6d-4778-a7da-909a8e63b0d1" containerID="7e560137c9d23fbadd287207ad3745d78bb33c05e2a9a80874e53601e9e9745c" exitCode=0 Mar 18 15:07:18 crc kubenswrapper[4756]: I0318 15:07:18.468354 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znvsh" event={"ID":"181ea872-4c6d-4778-a7da-909a8e63b0d1","Type":"ContainerDied","Data":"7e560137c9d23fbadd287207ad3745d78bb33c05e2a9a80874e53601e9e9745c"} Mar 18 15:07:20 crc kubenswrapper[4756]: I0318 15:07:20.490820 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znvsh" 
event={"ID":"181ea872-4c6d-4778-a7da-909a8e63b0d1","Type":"ContainerStarted","Data":"2973de7265fdac569779e3a21883edfceff5c7758bd0952f03cf6b443498d5db"} Mar 18 15:07:25 crc kubenswrapper[4756]: I0318 15:07:25.316789 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:07:25 crc kubenswrapper[4756]: E0318 15:07:25.317424 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:07:26 crc kubenswrapper[4756]: I0318 15:07:26.543313 4756 generic.go:334] "Generic (PLEG): container finished" podID="181ea872-4c6d-4778-a7da-909a8e63b0d1" containerID="2973de7265fdac569779e3a21883edfceff5c7758bd0952f03cf6b443498d5db" exitCode=0 Mar 18 15:07:26 crc kubenswrapper[4756]: I0318 15:07:26.543386 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znvsh" event={"ID":"181ea872-4c6d-4778-a7da-909a8e63b0d1","Type":"ContainerDied","Data":"2973de7265fdac569779e3a21883edfceff5c7758bd0952f03cf6b443498d5db"} Mar 18 15:07:27 crc kubenswrapper[4756]: I0318 15:07:27.556453 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znvsh" event={"ID":"181ea872-4c6d-4778-a7da-909a8e63b0d1","Type":"ContainerStarted","Data":"8642b33535abf497195ecc4da4d0e24736b15acf1545c407a90f3b60bac9d68b"} Mar 18 15:07:27 crc kubenswrapper[4756]: I0318 15:07:27.576256 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-znvsh" podStartSLOduration=2.942874336 podStartE2EDuration="11.576238979s" 
podCreationTimestamp="2026-03-18 15:07:16 +0000 UTC" firstStartedPulling="2026-03-18 15:07:18.47046276 +0000 UTC m=+4039.784880735" lastFinishedPulling="2026-03-18 15:07:27.103827403 +0000 UTC m=+4048.418245378" observedRunningTime="2026-03-18 15:07:27.575649333 +0000 UTC m=+4048.890067318" watchObservedRunningTime="2026-03-18 15:07:27.576238979 +0000 UTC m=+4048.890656954" Mar 18 15:07:36 crc kubenswrapper[4756]: I0318 15:07:36.461461 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:07:36 crc kubenswrapper[4756]: I0318 15:07:36.462048 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:07:37 crc kubenswrapper[4756]: I0318 15:07:37.507725 4756 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-znvsh" podUID="181ea872-4c6d-4778-a7da-909a8e63b0d1" containerName="registry-server" probeResult="failure" output=< Mar 18 15:07:37 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Mar 18 15:07:37 crc kubenswrapper[4756]: > Mar 18 15:07:40 crc kubenswrapper[4756]: I0318 15:07:40.316419 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:07:40 crc kubenswrapper[4756]: E0318 15:07:40.317294 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:07:47 crc kubenswrapper[4756]: I0318 15:07:47.515837 4756 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-znvsh" podUID="181ea872-4c6d-4778-a7da-909a8e63b0d1" containerName="registry-server" probeResult="failure" output=< Mar 18 15:07:47 crc kubenswrapper[4756]: timeout: failed to connect service ":50051" within 1s Mar 18 15:07:47 crc kubenswrapper[4756]: > Mar 18 15:07:54 crc kubenswrapper[4756]: I0318 15:07:54.114677 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q65jc"] Mar 18 15:07:54 crc kubenswrapper[4756]: I0318 15:07:54.117384 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:07:54 crc kubenswrapper[4756]: I0318 15:07:54.126314 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q65jc"] Mar 18 15:07:54 crc kubenswrapper[4756]: I0318 15:07:54.158638 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-utilities\") pod \"community-operators-q65jc\" (UID: \"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb\") " pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:07:54 crc kubenswrapper[4756]: I0318 15:07:54.158696 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgmrk\" (UniqueName: \"kubernetes.io/projected/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-kube-api-access-tgmrk\") pod \"community-operators-q65jc\" (UID: \"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb\") " pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:07:54 crc kubenswrapper[4756]: I0318 15:07:54.158724 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-catalog-content\") pod \"community-operators-q65jc\" (UID: 
\"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb\") " pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:07:54 crc kubenswrapper[4756]: I0318 15:07:54.260139 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-utilities\") pod \"community-operators-q65jc\" (UID: \"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb\") " pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:07:54 crc kubenswrapper[4756]: I0318 15:07:54.260440 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgmrk\" (UniqueName: \"kubernetes.io/projected/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-kube-api-access-tgmrk\") pod \"community-operators-q65jc\" (UID: \"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb\") " pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:07:54 crc kubenswrapper[4756]: I0318 15:07:54.260554 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-catalog-content\") pod \"community-operators-q65jc\" (UID: \"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb\") " pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:07:54 crc kubenswrapper[4756]: I0318 15:07:54.260629 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-utilities\") pod \"community-operators-q65jc\" (UID: \"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb\") " pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:07:54 crc kubenswrapper[4756]: I0318 15:07:54.260930 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-catalog-content\") pod \"community-operators-q65jc\" (UID: \"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb\") 
" pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:07:54 crc kubenswrapper[4756]: I0318 15:07:54.287041 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgmrk\" (UniqueName: \"kubernetes.io/projected/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-kube-api-access-tgmrk\") pod \"community-operators-q65jc\" (UID: \"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb\") " pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:07:54 crc kubenswrapper[4756]: I0318 15:07:54.315553 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:07:54 crc kubenswrapper[4756]: E0318 15:07:54.315872 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:07:54 crc kubenswrapper[4756]: I0318 15:07:54.435733 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:07:55 crc kubenswrapper[4756]: I0318 15:07:55.210756 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q65jc"] Mar 18 15:07:55 crc kubenswrapper[4756]: I0318 15:07:55.812979 4756 generic.go:334] "Generic (PLEG): container finished" podID="89a5e24f-5fc8-419b-ba3f-28d5ddf93adb" containerID="39d4332099d31807e02043126d22f2e431437db3b705c587a4b12f0b42482437" exitCode=0 Mar 18 15:07:55 crc kubenswrapper[4756]: I0318 15:07:55.813300 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q65jc" event={"ID":"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb","Type":"ContainerDied","Data":"39d4332099d31807e02043126d22f2e431437db3b705c587a4b12f0b42482437"} Mar 18 15:07:55 crc kubenswrapper[4756]: I0318 15:07:55.813332 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q65jc" event={"ID":"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb","Type":"ContainerStarted","Data":"c27003a9e9fa660a971f07e768a90ae25781b9299d2fed696778a08ffde0fcbc"} Mar 18 15:07:56 crc kubenswrapper[4756]: I0318 15:07:56.520829 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:07:56 crc kubenswrapper[4756]: I0318 15:07:56.573987 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:07:57 crc kubenswrapper[4756]: I0318 15:07:57.831720 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q65jc" event={"ID":"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb","Type":"ContainerStarted","Data":"9b4507fdb53cb379ed4392987f34bbd0e4b317100a106ed61f619efae130df68"} Mar 18 15:07:58 crc kubenswrapper[4756]: I0318 15:07:58.894814 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-znvsh"] Mar 18 15:07:58 crc kubenswrapper[4756]: I0318 15:07:58.895292 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-znvsh" podUID="181ea872-4c6d-4778-a7da-909a8e63b0d1" containerName="registry-server" containerID="cri-o://8642b33535abf497195ecc4da4d0e24736b15acf1545c407a90f3b60bac9d68b" gracePeriod=2 Mar 18 15:07:59 crc kubenswrapper[4756]: I0318 15:07:59.858949 4756 generic.go:334] "Generic (PLEG): container finished" podID="89a5e24f-5fc8-419b-ba3f-28d5ddf93adb" containerID="9b4507fdb53cb379ed4392987f34bbd0e4b317100a106ed61f619efae130df68" exitCode=0 Mar 18 15:07:59 crc kubenswrapper[4756]: I0318 15:07:59.859359 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q65jc" event={"ID":"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb","Type":"ContainerDied","Data":"9b4507fdb53cb379ed4392987f34bbd0e4b317100a106ed61f619efae130df68"} Mar 18 15:07:59 crc kubenswrapper[4756]: I0318 15:07:59.865943 4756 generic.go:334] "Generic (PLEG): container finished" podID="181ea872-4c6d-4778-a7da-909a8e63b0d1" containerID="8642b33535abf497195ecc4da4d0e24736b15acf1545c407a90f3b60bac9d68b" exitCode=0 Mar 18 15:07:59 crc kubenswrapper[4756]: I0318 15:07:59.866011 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znvsh" event={"ID":"181ea872-4c6d-4778-a7da-909a8e63b0d1","Type":"ContainerDied","Data":"8642b33535abf497195ecc4da4d0e24736b15acf1545c407a90f3b60bac9d68b"} Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.152229 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564108-bndk7"] Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.153716 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564108-bndk7" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.156257 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.156465 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.156619 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.164325 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564108-bndk7"] Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.305641 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppg7t\" (UniqueName: \"kubernetes.io/projected/f9a7cf87-c110-449b-8aea-cb9cb268e403-kube-api-access-ppg7t\") pod \"auto-csr-approver-29564108-bndk7\" (UID: \"f9a7cf87-c110-449b-8aea-cb9cb268e403\") " pod="openshift-infra/auto-csr-approver-29564108-bndk7" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.322826 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.406879 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkjt4\" (UniqueName: \"kubernetes.io/projected/181ea872-4c6d-4778-a7da-909a8e63b0d1-kube-api-access-hkjt4\") pod \"181ea872-4c6d-4778-a7da-909a8e63b0d1\" (UID: \"181ea872-4c6d-4778-a7da-909a8e63b0d1\") " Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.407121 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181ea872-4c6d-4778-a7da-909a8e63b0d1-utilities\") pod \"181ea872-4c6d-4778-a7da-909a8e63b0d1\" (UID: \"181ea872-4c6d-4778-a7da-909a8e63b0d1\") " Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.407184 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181ea872-4c6d-4778-a7da-909a8e63b0d1-catalog-content\") pod \"181ea872-4c6d-4778-a7da-909a8e63b0d1\" (UID: \"181ea872-4c6d-4778-a7da-909a8e63b0d1\") " Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.407416 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppg7t\" (UniqueName: \"kubernetes.io/projected/f9a7cf87-c110-449b-8aea-cb9cb268e403-kube-api-access-ppg7t\") pod \"auto-csr-approver-29564108-bndk7\" (UID: \"f9a7cf87-c110-449b-8aea-cb9cb268e403\") " pod="openshift-infra/auto-csr-approver-29564108-bndk7" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.410442 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/181ea872-4c6d-4778-a7da-909a8e63b0d1-utilities" (OuterVolumeSpecName: "utilities") pod "181ea872-4c6d-4778-a7da-909a8e63b0d1" (UID: "181ea872-4c6d-4778-a7da-909a8e63b0d1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.413616 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/181ea872-4c6d-4778-a7da-909a8e63b0d1-kube-api-access-hkjt4" (OuterVolumeSpecName: "kube-api-access-hkjt4") pod "181ea872-4c6d-4778-a7da-909a8e63b0d1" (UID: "181ea872-4c6d-4778-a7da-909a8e63b0d1"). InnerVolumeSpecName "kube-api-access-hkjt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.425803 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppg7t\" (UniqueName: \"kubernetes.io/projected/f9a7cf87-c110-449b-8aea-cb9cb268e403-kube-api-access-ppg7t\") pod \"auto-csr-approver-29564108-bndk7\" (UID: \"f9a7cf87-c110-449b-8aea-cb9cb268e403\") " pod="openshift-infra/auto-csr-approver-29564108-bndk7" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.483725 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564108-bndk7" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.515556 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkjt4\" (UniqueName: \"kubernetes.io/projected/181ea872-4c6d-4778-a7da-909a8e63b0d1-kube-api-access-hkjt4\") on node \"crc\" DevicePath \"\"" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.515591 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181ea872-4c6d-4778-a7da-909a8e63b0d1-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.576990 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/181ea872-4c6d-4778-a7da-909a8e63b0d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "181ea872-4c6d-4778-a7da-909a8e63b0d1" (UID: "181ea872-4c6d-4778-a7da-909a8e63b0d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.616999 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181ea872-4c6d-4778-a7da-909a8e63b0d1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.901435 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znvsh" event={"ID":"181ea872-4c6d-4778-a7da-909a8e63b0d1","Type":"ContainerDied","Data":"eba41c93d525f3db0a8fe50895abf6615b70a1aafd3f9f0dd2e8f3ead895a736"} Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.901499 4756 scope.go:117] "RemoveContainer" containerID="8642b33535abf497195ecc4da4d0e24736b15acf1545c407a90f3b60bac9d68b" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.901706 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-znvsh" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.909602 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q65jc" event={"ID":"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb","Type":"ContainerStarted","Data":"854333e6abebe1fa5810a76a70d628b10724a5301122a4a1c0b8725e16797616"} Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.948284 4756 scope.go:117] "RemoveContainer" containerID="2973de7265fdac569779e3a21883edfceff5c7758bd0952f03cf6b443498d5db" Mar 18 15:08:00 crc kubenswrapper[4756]: I0318 15:08:00.994310 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q65jc" podStartSLOduration=2.441843541 podStartE2EDuration="6.994288779s" podCreationTimestamp="2026-03-18 15:07:54 +0000 UTC" firstStartedPulling="2026-03-18 15:07:55.81503384 +0000 UTC m=+4077.129451815" lastFinishedPulling="2026-03-18 15:08:00.367479078 +0000 UTC m=+4081.681897053" observedRunningTime="2026-03-18 15:08:00.946432711 +0000 UTC m=+4082.260850696" watchObservedRunningTime="2026-03-18 15:08:00.994288779 +0000 UTC m=+4082.308706754" Mar 18 15:08:01 crc kubenswrapper[4756]: I0318 15:08:01.000367 4756 scope.go:117] "RemoveContainer" containerID="7e560137c9d23fbadd287207ad3745d78bb33c05e2a9a80874e53601e9e9745c" Mar 18 15:08:01 crc kubenswrapper[4756]: I0318 15:08:01.003268 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-znvsh"] Mar 18 15:08:01 crc kubenswrapper[4756]: I0318 15:08:01.015213 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-znvsh"] Mar 18 15:08:01 crc kubenswrapper[4756]: W0318 15:08:01.243602 4756 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9a7cf87_c110_449b_8aea_cb9cb268e403.slice/crio-378cd75129094a1569338af964c8250c984a47d59c4ca2b1ba862f2e942beed0 WatchSource:0}: Error finding container 378cd75129094a1569338af964c8250c984a47d59c4ca2b1ba862f2e942beed0: Status 404 returned error can't find the container with id 378cd75129094a1569338af964c8250c984a47d59c4ca2b1ba862f2e942beed0 Mar 18 15:08:01 crc kubenswrapper[4756]: I0318 15:08:01.246779 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564108-bndk7"] Mar 18 15:08:01 crc kubenswrapper[4756]: I0318 15:08:01.328109 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="181ea872-4c6d-4778-a7da-909a8e63b0d1" path="/var/lib/kubelet/pods/181ea872-4c6d-4778-a7da-909a8e63b0d1/volumes" Mar 18 15:08:01 crc kubenswrapper[4756]: I0318 15:08:01.918749 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564108-bndk7" event={"ID":"f9a7cf87-c110-449b-8aea-cb9cb268e403","Type":"ContainerStarted","Data":"378cd75129094a1569338af964c8250c984a47d59c4ca2b1ba862f2e942beed0"} Mar 18 15:08:02 crc kubenswrapper[4756]: I0318 15:08:02.929551 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564108-bndk7" event={"ID":"f9a7cf87-c110-449b-8aea-cb9cb268e403","Type":"ContainerStarted","Data":"7ac63b6564e13b8f16d7d29b665f772132605b0d5bd1105c085beb7b8d35ffd9"} Mar 18 15:08:02 crc kubenswrapper[4756]: I0318 15:08:02.952337 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564108-bndk7" podStartSLOduration=2.0733918 podStartE2EDuration="2.952317397s" podCreationTimestamp="2026-03-18 15:08:00 +0000 UTC" firstStartedPulling="2026-03-18 15:08:01.245708324 +0000 UTC m=+4082.560126299" lastFinishedPulling="2026-03-18 15:08:02.124633921 +0000 UTC m=+4083.439051896" observedRunningTime="2026-03-18 
15:08:02.945162754 +0000 UTC m=+4084.259580729" watchObservedRunningTime="2026-03-18 15:08:02.952317397 +0000 UTC m=+4084.266735362" Mar 18 15:08:03 crc kubenswrapper[4756]: I0318 15:08:03.939312 4756 generic.go:334] "Generic (PLEG): container finished" podID="f9a7cf87-c110-449b-8aea-cb9cb268e403" containerID="7ac63b6564e13b8f16d7d29b665f772132605b0d5bd1105c085beb7b8d35ffd9" exitCode=0 Mar 18 15:08:03 crc kubenswrapper[4756]: I0318 15:08:03.939389 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564108-bndk7" event={"ID":"f9a7cf87-c110-449b-8aea-cb9cb268e403","Type":"ContainerDied","Data":"7ac63b6564e13b8f16d7d29b665f772132605b0d5bd1105c085beb7b8d35ffd9"} Mar 18 15:08:04 crc kubenswrapper[4756]: I0318 15:08:04.436823 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:08:04 crc kubenswrapper[4756]: I0318 15:08:04.437140 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:08:04 crc kubenswrapper[4756]: I0318 15:08:04.485111 4756 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:08:06 crc kubenswrapper[4756]: I0318 15:08:06.009361 4756 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:08:06 crc kubenswrapper[4756]: I0318 15:08:06.316880 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:08:06 crc kubenswrapper[4756]: E0318 15:08:06.317741 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:08:06 crc kubenswrapper[4756]: I0318 15:08:06.684855 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q65jc"] Mar 18 15:08:06 crc kubenswrapper[4756]: I0318 15:08:06.727979 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564108-bndk7" Mar 18 15:08:06 crc kubenswrapper[4756]: I0318 15:08:06.883971 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppg7t\" (UniqueName: \"kubernetes.io/projected/f9a7cf87-c110-449b-8aea-cb9cb268e403-kube-api-access-ppg7t\") pod \"f9a7cf87-c110-449b-8aea-cb9cb268e403\" (UID: \"f9a7cf87-c110-449b-8aea-cb9cb268e403\") " Mar 18 15:08:06 crc kubenswrapper[4756]: I0318 15:08:06.891494 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a7cf87-c110-449b-8aea-cb9cb268e403-kube-api-access-ppg7t" (OuterVolumeSpecName: "kube-api-access-ppg7t") pod "f9a7cf87-c110-449b-8aea-cb9cb268e403" (UID: "f9a7cf87-c110-449b-8aea-cb9cb268e403"). InnerVolumeSpecName "kube-api-access-ppg7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:08:06 crc kubenswrapper[4756]: I0318 15:08:06.970724 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564108-bndk7" event={"ID":"f9a7cf87-c110-449b-8aea-cb9cb268e403","Type":"ContainerDied","Data":"378cd75129094a1569338af964c8250c984a47d59c4ca2b1ba862f2e942beed0"} Mar 18 15:08:06 crc kubenswrapper[4756]: I0318 15:08:06.970782 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="378cd75129094a1569338af964c8250c984a47d59c4ca2b1ba862f2e942beed0" Mar 18 15:08:06 crc kubenswrapper[4756]: I0318 15:08:06.970788 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564108-bndk7" Mar 18 15:08:06 crc kubenswrapper[4756]: I0318 15:08:06.986332 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppg7t\" (UniqueName: \"kubernetes.io/projected/f9a7cf87-c110-449b-8aea-cb9cb268e403-kube-api-access-ppg7t\") on node \"crc\" DevicePath \"\"" Mar 18 15:08:07 crc kubenswrapper[4756]: I0318 15:08:07.788993 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564102-ddts6"] Mar 18 15:08:07 crc kubenswrapper[4756]: I0318 15:08:07.801264 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564102-ddts6"] Mar 18 15:08:07 crc kubenswrapper[4756]: I0318 15:08:07.984006 4756 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q65jc" podUID="89a5e24f-5fc8-419b-ba3f-28d5ddf93adb" containerName="registry-server" containerID="cri-o://854333e6abebe1fa5810a76a70d628b10724a5301122a4a1c0b8725e16797616" gracePeriod=2 Mar 18 15:08:08 crc kubenswrapper[4756]: I0318 15:08:08.995856 4756 generic.go:334] "Generic (PLEG): container finished" podID="89a5e24f-5fc8-419b-ba3f-28d5ddf93adb" 
containerID="854333e6abebe1fa5810a76a70d628b10724a5301122a4a1c0b8725e16797616" exitCode=0 Mar 18 15:08:08 crc kubenswrapper[4756]: I0318 15:08:08.995942 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q65jc" event={"ID":"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb","Type":"ContainerDied","Data":"854333e6abebe1fa5810a76a70d628b10724a5301122a4a1c0b8725e16797616"} Mar 18 15:08:09 crc kubenswrapper[4756]: I0318 15:08:09.327693 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4599defb-1667-4008-8eed-a68665159cac" path="/var/lib/kubelet/pods/4599defb-1667-4008-8eed-a68665159cac/volumes" Mar 18 15:08:10 crc kubenswrapper[4756]: I0318 15:08:10.217275 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:08:10 crc kubenswrapper[4756]: I0318 15:08:10.348281 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-catalog-content\") pod \"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb\" (UID: \"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb\") " Mar 18 15:08:10 crc kubenswrapper[4756]: I0318 15:08:10.348420 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgmrk\" (UniqueName: \"kubernetes.io/projected/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-kube-api-access-tgmrk\") pod \"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb\" (UID: \"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb\") " Mar 18 15:08:10 crc kubenswrapper[4756]: I0318 15:08:10.348624 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-utilities\") pod \"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb\" (UID: \"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb\") " Mar 18 15:08:10 crc kubenswrapper[4756]: I0318 15:08:10.349712 4756 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-utilities" (OuterVolumeSpecName: "utilities") pod "89a5e24f-5fc8-419b-ba3f-28d5ddf93adb" (UID: "89a5e24f-5fc8-419b-ba3f-28d5ddf93adb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:08:10 crc kubenswrapper[4756]: I0318 15:08:10.354704 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-kube-api-access-tgmrk" (OuterVolumeSpecName: "kube-api-access-tgmrk") pod "89a5e24f-5fc8-419b-ba3f-28d5ddf93adb" (UID: "89a5e24f-5fc8-419b-ba3f-28d5ddf93adb"). InnerVolumeSpecName "kube-api-access-tgmrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:08:10 crc kubenswrapper[4756]: I0318 15:08:10.410392 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89a5e24f-5fc8-419b-ba3f-28d5ddf93adb" (UID: "89a5e24f-5fc8-419b-ba3f-28d5ddf93adb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 15:08:10 crc kubenswrapper[4756]: I0318 15:08:10.451348 4756 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 15:08:10 crc kubenswrapper[4756]: I0318 15:08:10.451378 4756 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 15:08:10 crc kubenswrapper[4756]: I0318 15:08:10.451388 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgmrk\" (UniqueName: \"kubernetes.io/projected/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb-kube-api-access-tgmrk\") on node \"crc\" DevicePath \"\"" Mar 18 15:08:11 crc kubenswrapper[4756]: I0318 15:08:11.016930 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q65jc" event={"ID":"89a5e24f-5fc8-419b-ba3f-28d5ddf93adb","Type":"ContainerDied","Data":"c27003a9e9fa660a971f07e768a90ae25781b9299d2fed696778a08ffde0fcbc"} Mar 18 15:08:11 crc kubenswrapper[4756]: I0318 15:08:11.016993 4756 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q65jc" Mar 18 15:08:11 crc kubenswrapper[4756]: I0318 15:08:11.017278 4756 scope.go:117] "RemoveContainer" containerID="854333e6abebe1fa5810a76a70d628b10724a5301122a4a1c0b8725e16797616" Mar 18 15:08:11 crc kubenswrapper[4756]: I0318 15:08:11.048396 4756 scope.go:117] "RemoveContainer" containerID="9b4507fdb53cb379ed4392987f34bbd0e4b317100a106ed61f619efae130df68" Mar 18 15:08:11 crc kubenswrapper[4756]: I0318 15:08:11.051649 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q65jc"] Mar 18 15:08:11 crc kubenswrapper[4756]: I0318 15:08:11.065366 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q65jc"] Mar 18 15:08:11 crc kubenswrapper[4756]: I0318 15:08:11.079931 4756 scope.go:117] "RemoveContainer" containerID="39d4332099d31807e02043126d22f2e431437db3b705c587a4b12f0b42482437" Mar 18 15:08:11 crc kubenswrapper[4756]: I0318 15:08:11.329398 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89a5e24f-5fc8-419b-ba3f-28d5ddf93adb" path="/var/lib/kubelet/pods/89a5e24f-5fc8-419b-ba3f-28d5ddf93adb/volumes" Mar 18 15:08:17 crc kubenswrapper[4756]: I0318 15:08:17.316281 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:08:17 crc kubenswrapper[4756]: E0318 15:08:17.317211 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:08:32 crc kubenswrapper[4756]: I0318 15:08:32.316217 4756 scope.go:117] "RemoveContainer" 
containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:08:32 crc kubenswrapper[4756]: E0318 15:08:32.316905 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:08:35 crc kubenswrapper[4756]: I0318 15:08:35.822246 4756 scope.go:117] "RemoveContainer" containerID="be48b0ceea894b98ef1b989c5d5ea085d855db986d656f8222a6efa420786932" Mar 18 15:08:46 crc kubenswrapper[4756]: I0318 15:08:46.315512 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:08:46 crc kubenswrapper[4756]: E0318 15:08:46.316228 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:09:01 crc kubenswrapper[4756]: I0318 15:09:01.315171 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:09:01 crc kubenswrapper[4756]: E0318 15:09:01.315868 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:09:13 crc kubenswrapper[4756]: I0318 15:09:13.316285 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:09:13 crc kubenswrapper[4756]: E0318 15:09:13.317278 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:09:24 crc kubenswrapper[4756]: I0318 15:09:24.316083 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:09:24 crc kubenswrapper[4756]: E0318 15:09:24.316993 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:09:35 crc kubenswrapper[4756]: I0318 15:09:35.315078 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:09:35 crc kubenswrapper[4756]: E0318 15:09:35.315896 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:09:46 crc kubenswrapper[4756]: I0318 15:09:46.315399 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:09:46 crc kubenswrapper[4756]: E0318 15:09:46.316159 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:09:59 crc kubenswrapper[4756]: I0318 15:09:59.330919 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:09:59 crc kubenswrapper[4756]: E0318 15:09:59.331776 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.147853 4756 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564110-tx6fp"] Mar 18 15:10:00 crc kubenswrapper[4756]: E0318 15:10:00.148869 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a7cf87-c110-449b-8aea-cb9cb268e403" containerName="oc" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.148889 4756 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a7cf87-c110-449b-8aea-cb9cb268e403" containerName="oc" Mar 18 15:10:00 crc kubenswrapper[4756]: E0318 15:10:00.148900 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5e24f-5fc8-419b-ba3f-28d5ddf93adb" containerName="registry-server" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.148907 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5e24f-5fc8-419b-ba3f-28d5ddf93adb" containerName="registry-server" Mar 18 15:10:00 crc kubenswrapper[4756]: E0318 15:10:00.148928 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5e24f-5fc8-419b-ba3f-28d5ddf93adb" containerName="extract-utilities" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.148936 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5e24f-5fc8-419b-ba3f-28d5ddf93adb" containerName="extract-utilities" Mar 18 15:10:00 crc kubenswrapper[4756]: E0318 15:10:00.148948 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181ea872-4c6d-4778-a7da-909a8e63b0d1" containerName="extract-content" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.148955 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="181ea872-4c6d-4778-a7da-909a8e63b0d1" containerName="extract-content" Mar 18 15:10:00 crc kubenswrapper[4756]: E0318 15:10:00.148962 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a5e24f-5fc8-419b-ba3f-28d5ddf93adb" containerName="extract-content" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.148968 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a5e24f-5fc8-419b-ba3f-28d5ddf93adb" containerName="extract-content" Mar 18 15:10:00 crc kubenswrapper[4756]: E0318 15:10:00.148982 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181ea872-4c6d-4778-a7da-909a8e63b0d1" containerName="extract-utilities" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.148988 4756 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="181ea872-4c6d-4778-a7da-909a8e63b0d1" containerName="extract-utilities" Mar 18 15:10:00 crc kubenswrapper[4756]: E0318 15:10:00.149008 4756 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181ea872-4c6d-4778-a7da-909a8e63b0d1" containerName="registry-server" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.149014 4756 state_mem.go:107] "Deleted CPUSet assignment" podUID="181ea872-4c6d-4778-a7da-909a8e63b0d1" containerName="registry-server" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.149221 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9a7cf87-c110-449b-8aea-cb9cb268e403" containerName="oc" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.149245 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a5e24f-5fc8-419b-ba3f-28d5ddf93adb" containerName="registry-server" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.149256 4756 memory_manager.go:354] "RemoveStaleState removing state" podUID="181ea872-4c6d-4778-a7da-909a8e63b0d1" containerName="registry-server" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.149938 4756 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564110-tx6fp" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.151730 4756 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b4qqx" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.152132 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.152322 4756 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.158316 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564110-tx6fp"] Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.241713 4756 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h69sc\" (UniqueName: \"kubernetes.io/projected/bebefec0-9833-485b-9bb5-003a6faa22e2-kube-api-access-h69sc\") pod \"auto-csr-approver-29564110-tx6fp\" (UID: \"bebefec0-9833-485b-9bb5-003a6faa22e2\") " pod="openshift-infra/auto-csr-approver-29564110-tx6fp" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.343481 4756 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h69sc\" (UniqueName: \"kubernetes.io/projected/bebefec0-9833-485b-9bb5-003a6faa22e2-kube-api-access-h69sc\") pod \"auto-csr-approver-29564110-tx6fp\" (UID: \"bebefec0-9833-485b-9bb5-003a6faa22e2\") " pod="openshift-infra/auto-csr-approver-29564110-tx6fp" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.363856 4756 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h69sc\" (UniqueName: \"kubernetes.io/projected/bebefec0-9833-485b-9bb5-003a6faa22e2-kube-api-access-h69sc\") pod \"auto-csr-approver-29564110-tx6fp\" (UID: \"bebefec0-9833-485b-9bb5-003a6faa22e2\") " 
pod="openshift-infra/auto-csr-approver-29564110-tx6fp" Mar 18 15:10:00 crc kubenswrapper[4756]: I0318 15:10:00.476925 4756 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564110-tx6fp" Mar 18 15:10:01 crc kubenswrapper[4756]: I0318 15:10:01.139980 4756 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564110-tx6fp"] Mar 18 15:10:02 crc kubenswrapper[4756]: I0318 15:10:02.084060 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564110-tx6fp" event={"ID":"bebefec0-9833-485b-9bb5-003a6faa22e2","Type":"ContainerStarted","Data":"fff5a8975f47b768ea23a1397c1e9daa834837df28ff1688fc25d775d0702911"} Mar 18 15:10:03 crc kubenswrapper[4756]: I0318 15:10:03.098665 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564110-tx6fp" event={"ID":"bebefec0-9833-485b-9bb5-003a6faa22e2","Type":"ContainerStarted","Data":"65f2a5fea0e2fb88bcbc9bf0f9d0f574df0da49c14f4b028162872d60d614e1e"} Mar 18 15:10:03 crc kubenswrapper[4756]: I0318 15:10:03.118175 4756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564110-tx6fp" podStartSLOduration=1.5603393429999999 podStartE2EDuration="3.118153892s" podCreationTimestamp="2026-03-18 15:10:00 +0000 UTC" firstStartedPulling="2026-03-18 15:10:01.144285252 +0000 UTC m=+4202.458703227" lastFinishedPulling="2026-03-18 15:10:02.702099801 +0000 UTC m=+4204.016517776" observedRunningTime="2026-03-18 15:10:03.110846905 +0000 UTC m=+4204.425264890" watchObservedRunningTime="2026-03-18 15:10:03.118153892 +0000 UTC m=+4204.432571877" Mar 18 15:10:04 crc kubenswrapper[4756]: I0318 15:10:04.110376 4756 generic.go:334] "Generic (PLEG): container finished" podID="bebefec0-9833-485b-9bb5-003a6faa22e2" containerID="65f2a5fea0e2fb88bcbc9bf0f9d0f574df0da49c14f4b028162872d60d614e1e" exitCode=0 Mar 18 15:10:04 crc 
kubenswrapper[4756]: I0318 15:10:04.110435 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564110-tx6fp" event={"ID":"bebefec0-9833-485b-9bb5-003a6faa22e2","Type":"ContainerDied","Data":"65f2a5fea0e2fb88bcbc9bf0f9d0f574df0da49c14f4b028162872d60d614e1e"} Mar 18 15:10:06 crc kubenswrapper[4756]: I0318 15:10:06.130478 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564110-tx6fp" event={"ID":"bebefec0-9833-485b-9bb5-003a6faa22e2","Type":"ContainerDied","Data":"fff5a8975f47b768ea23a1397c1e9daa834837df28ff1688fc25d775d0702911"} Mar 18 15:10:06 crc kubenswrapper[4756]: I0318 15:10:06.130779 4756 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fff5a8975f47b768ea23a1397c1e9daa834837df28ff1688fc25d775d0702911" Mar 18 15:10:06 crc kubenswrapper[4756]: I0318 15:10:06.148290 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564110-tx6fp" Mar 18 15:10:06 crc kubenswrapper[4756]: I0318 15:10:06.322723 4756 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h69sc\" (UniqueName: \"kubernetes.io/projected/bebefec0-9833-485b-9bb5-003a6faa22e2-kube-api-access-h69sc\") pod \"bebefec0-9833-485b-9bb5-003a6faa22e2\" (UID: \"bebefec0-9833-485b-9bb5-003a6faa22e2\") " Mar 18 15:10:06 crc kubenswrapper[4756]: I0318 15:10:06.902479 4756 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bebefec0-9833-485b-9bb5-003a6faa22e2-kube-api-access-h69sc" (OuterVolumeSpecName: "kube-api-access-h69sc") pod "bebefec0-9833-485b-9bb5-003a6faa22e2" (UID: "bebefec0-9833-485b-9bb5-003a6faa22e2"). InnerVolumeSpecName "kube-api-access-h69sc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 15:10:06 crc kubenswrapper[4756]: I0318 15:10:06.937722 4756 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h69sc\" (UniqueName: \"kubernetes.io/projected/bebefec0-9833-485b-9bb5-003a6faa22e2-kube-api-access-h69sc\") on node \"crc\" DevicePath \"\"" Mar 18 15:10:07 crc kubenswrapper[4756]: I0318 15:10:07.137430 4756 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564110-tx6fp" Mar 18 15:10:07 crc kubenswrapper[4756]: I0318 15:10:07.218919 4756 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564104-ph72j"] Mar 18 15:10:07 crc kubenswrapper[4756]: I0318 15:10:07.229156 4756 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564104-ph72j"] Mar 18 15:10:07 crc kubenswrapper[4756]: I0318 15:10:07.325944 4756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="600f14cc-1ff0-45ca-88ff-8f1fd727609a" path="/var/lib/kubelet/pods/600f14cc-1ff0-45ca-88ff-8f1fd727609a/volumes" Mar 18 15:10:11 crc kubenswrapper[4756]: I0318 15:10:11.315294 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:10:11 crc kubenswrapper[4756]: E0318 15:10:11.316098 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:10:25 crc kubenswrapper[4756]: I0318 15:10:25.315255 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 
15:10:25 crc kubenswrapper[4756]: E0318 15:10:25.316322 4756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qvpkg_openshift-machine-config-operator(c10ecbb9-ddab-48c7-9a86-abd122951622)\"" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" podUID="c10ecbb9-ddab-48c7-9a86-abd122951622" Mar 18 15:10:35 crc kubenswrapper[4756]: I0318 15:10:35.959857 4756 scope.go:117] "RemoveContainer" containerID="eceb29f19419aefc27b24dd5ee84d167b328f162fb7994519cb8c55dc6883c82" Mar 18 15:10:40 crc kubenswrapper[4756]: I0318 15:10:40.315858 4756 scope.go:117] "RemoveContainer" containerID="42acf6894827ecfb9bffa9ddd85cc878f4fc396fd6ccd4bd5c989f7fe3c8dfa4" Mar 18 15:10:40 crc kubenswrapper[4756]: I0318 15:10:40.879507 4756 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qvpkg" event={"ID":"c10ecbb9-ddab-48c7-9a86-abd122951622","Type":"ContainerStarted","Data":"18a7686be37f2ec571dacbb1ce533b095b3d30fe97dc05426d7d883813a595ae"}